Jan 23 09:00:48 localhost kernel: Linux version 5.14.0-661.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026
Jan 23 09:00:48 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Jan 23 09:00:48 localhost kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 09:00:48 localhost kernel: BIOS-provided physical RAM map:
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Jan 23 09:00:48 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Jan 23 09:00:48 localhost kernel: NX (Execute Disable) protection: active
Jan 23 09:00:48 localhost kernel: APIC: Static calls initialized
Jan 23 09:00:48 localhost kernel: SMBIOS 2.8 present.
Jan 23 09:00:48 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Jan 23 09:00:48 localhost kernel: Hypervisor detected: KVM
Jan 23 09:00:48 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Jan 23 09:00:48 localhost kernel: kvm-clock: using sched offset of 3915146415 cycles
Jan 23 09:00:48 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Jan 23 09:00:48 localhost kernel: tsc: Detected 2800.000 MHz processor
Jan 23 09:00:48 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Jan 23 09:00:48 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Jan 23 09:00:48 localhost kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Jan 23 09:00:48 localhost kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Jan 23 09:00:48 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Jan 23 09:00:48 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Jan 23 09:00:48 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Jan 23 09:00:48 localhost kernel: Using GB pages for direct mapping
Jan 23 09:00:48 localhost kernel: RAMDISK: [mem 0x2d426000-0x32a0afff]
Jan 23 09:00:48 localhost kernel: ACPI: Early table checksum verification disabled
Jan 23 09:00:48 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Jan 23 09:00:48 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:48 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:48 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:48 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Jan 23 09:00:48 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:48 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Jan 23 09:00:48 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Jan 23 09:00:48 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Jan 23 09:00:48 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Jan 23 09:00:48 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Jan 23 09:00:48 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Jan 23 09:00:48 localhost kernel: No NUMA configuration found
Jan 23 09:00:48 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Jan 23 09:00:48 localhost kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Jan 23 09:00:48 localhost kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Jan 23 09:00:48 localhost kernel: Zone ranges:
Jan 23 09:00:48 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Jan 23 09:00:48 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Jan 23 09:00:48 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 09:00:48 localhost kernel:   Device   empty
Jan 23 09:00:48 localhost kernel: Movable zone start for each node
Jan 23 09:00:48 localhost kernel: Early memory node ranges
Jan 23 09:00:48 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Jan 23 09:00:48 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Jan 23 09:00:48 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000023fffffff]
Jan 23 09:00:48 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Jan 23 09:00:48 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Jan 23 09:00:48 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Jan 23 09:00:48 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Jan 23 09:00:48 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Jan 23 09:00:48 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Jan 23 09:00:48 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Jan 23 09:00:48 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Jan 23 09:00:48 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Jan 23 09:00:48 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Jan 23 09:00:48 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Jan 23 09:00:48 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Jan 23 09:00:48 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Jan 23 09:00:48 localhost kernel: TSC deadline timer available
Jan 23 09:00:48 localhost kernel: CPU topo: Max. logical packages:   8
Jan 23 09:00:48 localhost kernel: CPU topo: Max. logical dies:       8
Jan 23 09:00:48 localhost kernel: CPU topo: Max. dies per package:   1
Jan 23 09:00:48 localhost kernel: CPU topo: Max. threads per core:   1
Jan 23 09:00:48 localhost kernel: CPU topo: Num. cores per package:     1
Jan 23 09:00:48 localhost kernel: CPU topo: Num. threads per package:   1
Jan 23 09:00:48 localhost kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Jan 23 09:00:48 localhost kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Jan 23 09:00:48 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Jan 23 09:00:48 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Jan 23 09:00:48 localhost kernel: Booting paravirtualized kernel on KVM
Jan 23 09:00:48 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Jan 23 09:00:48 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Jan 23 09:00:48 localhost kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Jan 23 09:00:48 localhost kernel: pcpu-alloc: s225280 r8192 d28672 u262144 alloc=1*2097152
Jan 23 09:00:48 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Jan 23 09:00:48 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Jan 23 09:00:48 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 09:00:48 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64", will be passed to user space.
Jan 23 09:00:48 localhost kernel: random: crng init done
Jan 23 09:00:48 localhost kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Jan 23 09:00:48 localhost kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jan 23 09:00:48 localhost kernel: Fallback order for Node 0: 0 
Jan 23 09:00:48 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Jan 23 09:00:48 localhost kernel: Policy zone: Normal
Jan 23 09:00:48 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jan 23 09:00:48 localhost kernel: software IO TLB: area num 8.
Jan 23 09:00:48 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Jan 23 09:00:48 localhost kernel: ftrace: allocating 49417 entries in 194 pages
Jan 23 09:00:48 localhost kernel: ftrace: allocated 194 pages with 3 groups
Jan 23 09:00:48 localhost kernel: Dynamic Preempt: voluntary
Jan 23 09:00:48 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Jan 23 09:00:48 localhost kernel: rcu:         RCU event tracing is enabled.
Jan 23 09:00:48 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Jan 23 09:00:48 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Jan 23 09:00:48 localhost kernel:         Rude variant of Tasks RCU enabled.
Jan 23 09:00:48 localhost kernel:         Tracing variant of Tasks RCU enabled.
Jan 23 09:00:48 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jan 23 09:00:48 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Jan 23 09:00:48 localhost kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 09:00:48 localhost kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 09:00:48 localhost kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Jan 23 09:00:48 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Jan 23 09:00:48 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jan 23 09:00:48 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Jan 23 09:00:48 localhost kernel: Console: colour VGA+ 80x25
Jan 23 09:00:48 localhost kernel: printk: console [ttyS0] enabled
Jan 23 09:00:48 localhost kernel: ACPI: Core revision 20230331
Jan 23 09:00:48 localhost kernel: APIC: Switch to symmetric I/O mode setup
Jan 23 09:00:48 localhost kernel: x2apic enabled
Jan 23 09:00:48 localhost kernel: APIC: Switched APIC routing to: physical x2apic
Jan 23 09:00:48 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Jan 23 09:00:48 localhost kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Jan 23 09:00:48 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Jan 23 09:00:48 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Jan 23 09:00:48 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Jan 23 09:00:48 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Jan 23 09:00:48 localhost kernel: Spectre V2 : Mitigation: Retpolines
Jan 23 09:00:48 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Jan 23 09:00:48 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Jan 23 09:00:48 localhost kernel: RETBleed: Mitigation: untrained return thunk
Jan 23 09:00:48 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Jan 23 09:00:48 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Jan 23 09:00:48 localhost kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Jan 23 09:00:48 localhost kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Jan 23 09:00:48 localhost kernel: x86/bugs: return thunk changed
Jan 23 09:00:48 localhost kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Jan 23 09:00:48 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Jan 23 09:00:48 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Jan 23 09:00:48 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Jan 23 09:00:48 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Jan 23 09:00:48 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Jan 23 09:00:48 localhost kernel: Freeing SMP alternatives memory: 40K
Jan 23 09:00:48 localhost kernel: pid_max: default: 32768 minimum: 301
Jan 23 09:00:48 localhost kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Jan 23 09:00:48 localhost kernel: landlock: Up and running.
Jan 23 09:00:48 localhost kernel: Yama: becoming mindful.
Jan 23 09:00:48 localhost kernel: SELinux:  Initializing.
Jan 23 09:00:48 localhost kernel: LSM support for eBPF active
Jan 23 09:00:48 localhost kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 09:00:48 localhost kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Jan 23 09:00:48 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Jan 23 09:00:48 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Jan 23 09:00:48 localhost kernel: ... version:                0
Jan 23 09:00:48 localhost kernel: ... bit width:              48
Jan 23 09:00:48 localhost kernel: ... generic registers:      6
Jan 23 09:00:48 localhost kernel: ... value mask:             0000ffffffffffff
Jan 23 09:00:48 localhost kernel: ... max period:             00007fffffffffff
Jan 23 09:00:48 localhost kernel: ... fixed-purpose events:   0
Jan 23 09:00:48 localhost kernel: ... event mask:             000000000000003f
Jan 23 09:00:48 localhost kernel: signal: max sigframe size: 1776
Jan 23 09:00:48 localhost kernel: rcu: Hierarchical SRCU implementation.
Jan 23 09:00:48 localhost kernel: rcu:         Max phase no-delay instances is 400.
Jan 23 09:00:48 localhost kernel: smp: Bringing up secondary CPUs ...
Jan 23 09:00:48 localhost kernel: smpboot: x86: Booting SMP configuration:
Jan 23 09:00:48 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Jan 23 09:00:48 localhost kernel: smp: Brought up 1 node, 8 CPUs
Jan 23 09:00:48 localhost kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Jan 23 09:00:48 localhost kernel: node 0 deferred pages initialised in 10ms
Jan 23 09:00:48 localhost kernel: Memory: 7763888K/8388068K available (16384K kernel code, 5797K rwdata, 13916K rodata, 4200K init, 7192K bss, 618356K reserved, 0K cma-reserved)
Jan 23 09:00:48 localhost kernel: devtmpfs: initialized
Jan 23 09:00:48 localhost kernel: x86/mm: Memory block size: 128MB
Jan 23 09:00:48 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jan 23 09:00:48 localhost kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Jan 23 09:00:48 localhost kernel: pinctrl core: initialized pinctrl subsystem
Jan 23 09:00:48 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jan 23 09:00:48 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Jan 23 09:00:48 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jan 23 09:00:48 localhost kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jan 23 09:00:48 localhost kernel: audit: initializing netlink subsys (disabled)
Jan 23 09:00:48 localhost kernel: audit: type=2000 audit(1769158846.456:1): state=initialized audit_enabled=0 res=1
Jan 23 09:00:48 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Jan 23 09:00:48 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Jan 23 09:00:48 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Jan 23 09:00:48 localhost kernel: cpuidle: using governor menu
Jan 23 09:00:48 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jan 23 09:00:48 localhost kernel: PCI: Using configuration type 1 for base access
Jan 23 09:00:48 localhost kernel: PCI: Using configuration type 1 for extended access
Jan 23 09:00:48 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Jan 23 09:00:48 localhost kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jan 23 09:00:48 localhost kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Jan 23 09:00:48 localhost kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jan 23 09:00:48 localhost kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Jan 23 09:00:48 localhost kernel: Demotion targets for Node 0: null
Jan 23 09:00:48 localhost kernel: cryptd: max_cpu_qlen set to 1000
Jan 23 09:00:48 localhost kernel: ACPI: Added _OSI(Module Device)
Jan 23 09:00:48 localhost kernel: ACPI: Added _OSI(Processor Device)
Jan 23 09:00:48 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jan 23 09:00:48 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jan 23 09:00:48 localhost kernel: ACPI: Interpreter enabled
Jan 23 09:00:48 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Jan 23 09:00:48 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Jan 23 09:00:48 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Jan 23 09:00:48 localhost kernel: PCI: Using E820 reservations for host bridge windows
Jan 23 09:00:48 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Jan 23 09:00:48 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jan 23 09:00:48 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [3] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [4] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [5] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [6] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [7] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [8] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [9] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [10] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [11] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [12] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [13] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [14] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [15] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [16] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [17] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [18] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [19] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [20] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [21] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [22] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [23] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [24] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [25] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [26] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [27] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [28] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [29] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [30] registered
Jan 23 09:00:48 localhost kernel: acpiphp: Slot [31] registered
Jan 23 09:00:48 localhost kernel: PCI host bridge to bus 0000:00
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jan 23 09:00:48 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Jan 23 09:00:48 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Jan 23 09:00:48 localhost kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Jan 23 09:00:48 localhost kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Jan 23 09:00:48 localhost kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Jan 23 09:00:48 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Jan 23 09:00:48 localhost kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Jan 23 09:00:48 localhost kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Jan 23 09:00:48 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Jan 23 09:00:48 localhost kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Jan 23 09:00:48 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Jan 23 09:00:48 localhost kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Jan 23 09:00:48 localhost kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Jan 23 09:00:48 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Jan 23 09:00:48 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Jan 23 09:00:48 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Jan 23 09:00:48 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Jan 23 09:00:48 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Jan 23 09:00:48 localhost kernel: iommu: Default domain type: Translated
Jan 23 09:00:48 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Jan 23 09:00:48 localhost kernel: SCSI subsystem initialized
Jan 23 09:00:48 localhost kernel: ACPI: bus type USB registered
Jan 23 09:00:48 localhost kernel: usbcore: registered new interface driver usbfs
Jan 23 09:00:48 localhost kernel: usbcore: registered new interface driver hub
Jan 23 09:00:48 localhost kernel: usbcore: registered new device driver usb
Jan 23 09:00:48 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Jan 23 09:00:48 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Jan 23 09:00:48 localhost kernel: PTP clock support registered
Jan 23 09:00:48 localhost kernel: EDAC MC: Ver: 3.0.0
Jan 23 09:00:48 localhost kernel: NetLabel: Initializing
Jan 23 09:00:48 localhost kernel: NetLabel:  domain hash size = 128
Jan 23 09:00:48 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Jan 23 09:00:48 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Jan 23 09:00:48 localhost kernel: PCI: Using ACPI for IRQ routing
Jan 23 09:00:48 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Jan 23 09:00:48 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Jan 23 09:00:48 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Jan 23 09:00:48 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Jan 23 09:00:48 localhost kernel: vgaarb: loaded
Jan 23 09:00:48 localhost kernel: clocksource: Switched to clocksource kvm-clock
Jan 23 09:00:48 localhost kernel: VFS: Disk quotas dquot_6.6.0
Jan 23 09:00:48 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Jan 23 09:00:48 localhost kernel: pnp: PnP ACPI init
Jan 23 09:00:48 localhost kernel: pnp 00:03: [dma 2]
Jan 23 09:00:48 localhost kernel: pnp: PnP ACPI: found 5 devices
Jan 23 09:00:48 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Jan 23 09:00:48 localhost kernel: NET: Registered PF_INET protocol family
Jan 23 09:00:48 localhost kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Jan 23 09:00:48 localhost kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Jan 23 09:00:48 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Jan 23 09:00:48 localhost kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Jan 23 09:00:48 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Jan 23 09:00:48 localhost kernel: TCP: Hash tables configured (established 65536 bind 65536)
Jan 23 09:00:48 localhost kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Jan 23 09:00:48 localhost kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 09:00:48 localhost kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Jan 23 09:00:48 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Jan 23 09:00:48 localhost kernel: NET: Registered PF_XDP protocol family
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Jan 23 09:00:48 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Jan 23 09:00:48 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Jan 23 09:00:48 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Jan 23 09:00:48 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 77158 usecs
Jan 23 09:00:48 localhost kernel: PCI: CLS 0 bytes, default 64
Jan 23 09:00:48 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Jan 23 09:00:48 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Jan 23 09:00:48 localhost kernel: Trying to unpack rootfs image as initramfs...
Jan 23 09:00:48 localhost kernel: ACPI: bus type thunderbolt registered
Jan 23 09:00:48 localhost kernel: Initialise system trusted keyrings
Jan 23 09:00:48 localhost kernel: Key type blacklist registered
Jan 23 09:00:48 localhost kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Jan 23 09:00:48 localhost kernel: zbud: loaded
Jan 23 09:00:48 localhost kernel: integrity: Platform Keyring initialized
Jan 23 09:00:48 localhost kernel: integrity: Machine keyring initialized
Jan 23 09:00:48 localhost kernel: Freeing initrd memory: 87956K
Jan 23 09:00:48 localhost kernel: NET: Registered PF_ALG protocol family
Jan 23 09:00:48 localhost kernel: xor: automatically using best checksumming function   avx       
Jan 23 09:00:48 localhost kernel: Key type asymmetric registered
Jan 23 09:00:48 localhost kernel: Asymmetric key parser 'x509' registered
Jan 23 09:00:48 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Jan 23 09:00:48 localhost kernel: io scheduler mq-deadline registered
Jan 23 09:00:48 localhost kernel: io scheduler kyber registered
Jan 23 09:00:48 localhost kernel: io scheduler bfq registered
Jan 23 09:00:48 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Jan 23 09:00:48 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Jan 23 09:00:48 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Jan 23 09:00:48 localhost kernel: ACPI: button: Power Button [PWRF]
Jan 23 09:00:48 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Jan 23 09:00:48 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Jan 23 09:00:48 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Jan 23 09:00:48 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Jan 23 09:00:48 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Jan 23 09:00:48 localhost kernel: Non-volatile memory driver v1.3
Jan 23 09:00:48 localhost kernel: rdac: device handler registered
Jan 23 09:00:48 localhost kernel: hp_sw: device handler registered
Jan 23 09:00:48 localhost kernel: emc: device handler registered
Jan 23 09:00:48 localhost kernel: alua: device handler registered
Jan 23 09:00:48 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Jan 23 09:00:48 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Jan 23 09:00:48 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Jan 23 09:00:48 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Jan 23 09:00:48 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Jan 23 09:00:48 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Jan 23 09:00:48 localhost kernel: usb usb1: Product: UHCI Host Controller
Jan 23 09:00:48 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-661.el9.x86_64 uhci_hcd
Jan 23 09:00:48 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Jan 23 09:00:48 localhost kernel: hub 1-0:1.0: USB hub found
Jan 23 09:00:48 localhost kernel: hub 1-0:1.0: 2 ports detected
Jan 23 09:00:48 localhost kernel: usbcore: registered new interface driver usbserial_generic
Jan 23 09:00:48 localhost kernel: usbserial: USB Serial support registered for generic
Jan 23 09:00:48 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Jan 23 09:00:48 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Jan 23 09:00:48 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Jan 23 09:00:48 localhost kernel: mousedev: PS/2 mouse device common for all mice
Jan 23 09:00:48 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Jan 23 09:00:48 localhost kernel: rtc_cmos 00:04: registered as rtc0
Jan 23 09:00:48 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-01-23T09:00:47 UTC (1769158847)
Jan 23 09:00:48 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Jan 23 09:00:48 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Jan 23 09:00:48 localhost kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Jan 23 09:00:48 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Jan 23 09:00:48 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Jan 23 09:00:48 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Jan 23 09:00:48 localhost kernel: usbcore: registered new interface driver usbhid
Jan 23 09:00:48 localhost kernel: usbhid: USB HID core driver
Jan 23 09:00:48 localhost kernel: drop_monitor: Initializing network drop monitor service
Jan 23 09:00:48 localhost kernel: Initializing XFRM netlink socket
Jan 23 09:00:48 localhost kernel: NET: Registered PF_INET6 protocol family
Jan 23 09:00:48 localhost kernel: Segment Routing with IPv6
Jan 23 09:00:48 localhost kernel: NET: Registered PF_PACKET protocol family
Jan 23 09:00:48 localhost kernel: mpls_gso: MPLS GSO support
Jan 23 09:00:48 localhost kernel: IPI shorthand broadcast: enabled
Jan 23 09:00:48 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Jan 23 09:00:48 localhost kernel: AES CTR mode by8 optimization enabled
Jan 23 09:00:48 localhost kernel: sched_clock: Marking stable (2011002213, 152216626)->(2321699301, -158480462)
Jan 23 09:00:48 localhost kernel: registered taskstats version 1
Jan 23 09:00:48 localhost kernel: Loading compiled-in X.509 certificates
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Jan 23 09:00:48 localhost kernel: Demotion targets for Node 0: null
Jan 23 09:00:48 localhost kernel: page_owner is disabled
Jan 23 09:00:48 localhost kernel: Key type .fscrypt registered
Jan 23 09:00:48 localhost kernel: Key type fscrypt-provisioning registered
Jan 23 09:00:48 localhost kernel: Key type big_key registered
Jan 23 09:00:48 localhost kernel: Key type encrypted registered
Jan 23 09:00:48 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Jan 23 09:00:48 localhost kernel: Loading compiled-in module X.509 certificates
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 04453f216699002fd63185eeab832de990bee6d7'
Jan 23 09:00:48 localhost kernel: ima: Allocated hash algorithm: sha256
Jan 23 09:00:48 localhost kernel: ima: No architecture policies found
Jan 23 09:00:48 localhost kernel: evm: Initialising EVM extended attributes:
Jan 23 09:00:48 localhost kernel: evm: security.selinux
Jan 23 09:00:48 localhost kernel: evm: security.SMACK64 (disabled)
Jan 23 09:00:48 localhost kernel: evm: security.SMACK64EXEC (disabled)
Jan 23 09:00:48 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Jan 23 09:00:48 localhost kernel: evm: security.SMACK64MMAP (disabled)
Jan 23 09:00:48 localhost kernel: evm: security.apparmor (disabled)
Jan 23 09:00:48 localhost kernel: evm: security.ima
Jan 23 09:00:48 localhost kernel: evm: security.capability
Jan 23 09:00:48 localhost kernel: evm: HMAC attrs: 0x1
Jan 23 09:00:48 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Jan 23 09:00:48 localhost kernel: Running certificate verification RSA selftest
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Jan 23 09:00:48 localhost kernel: Running certificate verification ECDSA selftest
Jan 23 09:00:48 localhost kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Jan 23 09:00:48 localhost kernel: clk: Disabling unused clocks
Jan 23 09:00:48 localhost kernel: Freeing unused decrypted memory: 2028K
Jan 23 09:00:48 localhost kernel: Freeing unused kernel image (initmem) memory: 4200K
Jan 23 09:00:48 localhost kernel: Write protecting the kernel read-only data: 30720k
Jan 23 09:00:48 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Jan 23 09:00:48 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Jan 23 09:00:48 localhost kernel: Run /init as init process
Jan 23 09:00:48 localhost kernel:   with arguments:
Jan 23 09:00:48 localhost kernel:     /init
Jan 23 09:00:48 localhost kernel:   with environment:
Jan 23 09:00:48 localhost kernel:     HOME=/
Jan 23 09:00:48 localhost kernel:     TERM=linux
Jan 23 09:00:48 localhost kernel:     BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64
Jan 23 09:00:48 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 09:00:48 localhost systemd[1]: Detected virtualization kvm.
Jan 23 09:00:48 localhost systemd[1]: Detected architecture x86-64.
Jan 23 09:00:48 localhost systemd[1]: Running in initrd.
Jan 23 09:00:48 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Jan 23 09:00:48 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Jan 23 09:00:48 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Jan 23 09:00:48 localhost kernel: usb 1-1: Manufacturer: QEMU
Jan 23 09:00:48 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Jan 23 09:00:48 localhost systemd[1]: No hostname configured, using default hostname.
Jan 23 09:00:48 localhost systemd[1]: Hostname set to <localhost>.
Jan 23 09:00:48 localhost systemd[1]: Initializing machine ID from VM UUID.
Jan 23 09:00:48 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Jan 23 09:00:48 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Jan 23 09:00:48 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Jan 23 09:00:48 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 09:00:48 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 23 09:00:48 localhost systemd[1]: Reached target Initrd /usr File System.
Jan 23 09:00:48 localhost systemd[1]: Reached target Local File Systems.
Jan 23 09:00:48 localhost systemd[1]: Reached target Path Units.
Jan 23 09:00:48 localhost systemd[1]: Reached target Slice Units.
Jan 23 09:00:48 localhost systemd[1]: Reached target Swaps.
Jan 23 09:00:48 localhost systemd[1]: Reached target Timer Units.
Jan 23 09:00:48 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 09:00:48 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Jan 23 09:00:48 localhost systemd[1]: Listening on Journal Socket.
Jan 23 09:00:48 localhost systemd[1]: Listening on udev Control Socket.
Jan 23 09:00:48 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 23 09:00:48 localhost systemd[1]: Reached target Socket Units.
Jan 23 09:00:48 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 23 09:00:48 localhost systemd[1]: Starting Journal Service...
Jan 23 09:00:48 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 09:00:48 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 23 09:00:48 localhost systemd[1]: Starting Create System Users...
Jan 23 09:00:48 localhost systemd[1]: Starting Setup Virtual Console...
Jan 23 09:00:48 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 09:00:48 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 23 09:00:48 localhost systemd[1]: Finished Create System Users.
Jan 23 09:00:48 localhost systemd-journald[307]: Journal started
Jan 23 09:00:48 localhost systemd-journald[307]: Runtime Journal (/run/log/journal/53821a391f4a4bf2b036ba3044ea8780) is 8.0M, max 153.6M, 145.6M free.
Jan 23 09:00:48 localhost systemd-sysusers[312]: Creating group 'users' with GID 100.
Jan 23 09:00:48 localhost systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Jan 23 09:00:48 localhost systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Jan 23 09:00:48 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 09:00:48 localhost systemd[1]: Started Journal Service.
Jan 23 09:00:48 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 09:00:48 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 09:00:48 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 09:00:48 localhost systemd[1]: Finished Setup Virtual Console.
Jan 23 09:00:48 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Jan 23 09:00:48 localhost systemd[1]: Starting dracut cmdline hook...
Jan 23 09:00:48 localhost dracut-cmdline[327]: dracut-9 dracut-057-102.git20250818.el9
Jan 23 09:00:48 localhost dracut-cmdline[327]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-661.el9.x86_64 root=UUID=22ac9141-3960-4912-b20e-19fc8a328d40 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Jan 23 09:00:48 localhost systemd[1]: Finished dracut cmdline hook.
Jan 23 09:00:48 localhost systemd[1]: Starting dracut pre-udev hook...
Jan 23 09:00:48 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Jan 23 09:00:48 localhost kernel: device-mapper: uevent: version 1.0.3
Jan 23 09:00:48 localhost kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Jan 23 09:00:48 localhost kernel: RPC: Registered named UNIX socket transport module.
Jan 23 09:00:48 localhost kernel: RPC: Registered udp transport module.
Jan 23 09:00:48 localhost kernel: RPC: Registered tcp transport module.
Jan 23 09:00:48 localhost kernel: RPC: Registered tcp-with-tls transport module.
Jan 23 09:00:48 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Jan 23 09:00:48 localhost rpc.statd[443]: Version 2.5.4 starting
Jan 23 09:00:48 localhost rpc.statd[443]: Initializing NSM state
Jan 23 09:00:48 localhost rpc.idmapd[448]: Setting log level to 0
Jan 23 09:00:48 localhost systemd[1]: Finished dracut pre-udev hook.
Jan 23 09:00:48 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 09:00:48 localhost systemd-udevd[461]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 09:00:48 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 09:00:48 localhost systemd[1]: Starting dracut pre-trigger hook...
Jan 23 09:00:48 localhost systemd[1]: Finished dracut pre-trigger hook.
Jan 23 09:00:48 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 23 09:00:48 localhost systemd[1]: Created slice Slice /system/modprobe.
Jan 23 09:00:49 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 09:00:49 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 23 09:00:49 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 09:00:49 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 09:00:49 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 09:00:49 localhost systemd[1]: Reached target Network.
Jan 23 09:00:49 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Jan 23 09:00:49 localhost systemd[1]: Starting dracut initqueue hook...
Jan 23 09:00:49 localhost kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Jan 23 09:00:49 localhost kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Jan 23 09:00:49 localhost kernel:  vda: vda1
Jan 23 09:00:49 localhost systemd[1]: Mounting Kernel Configuration File System...
Jan 23 09:00:49 localhost systemd[1]: Mounted Kernel Configuration File System.
Jan 23 09:00:49 localhost systemd[1]: Reached target System Initialization.
Jan 23 09:00:49 localhost systemd[1]: Reached target Basic System.
Jan 23 09:00:49 localhost kernel: libata version 3.00 loaded.
Jan 23 09:00:49 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Jan 23 09:00:49 localhost systemd-udevd[492]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:00:49 localhost kernel: scsi host0: ata_piix
Jan 23 09:00:49 localhost kernel: scsi host1: ata_piix
Jan 23 09:00:49 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Jan 23 09:00:49 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Jan 23 09:00:49 localhost systemd[1]: Found device /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 09:00:49 localhost systemd[1]: Reached target Initrd Root Device.
Jan 23 09:00:49 localhost kernel: ata1: found unknown device (class 0)
Jan 23 09:00:49 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Jan 23 09:00:49 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Jan 23 09:00:49 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Jan 23 09:00:49 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Jan 23 09:00:49 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Jan 23 09:00:49 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Jan 23 09:00:49 localhost systemd[1]: Finished dracut initqueue hook.
Jan 23 09:00:49 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 09:00:49 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Jan 23 09:00:49 localhost systemd[1]: Reached target Remote File Systems.
Jan 23 09:00:49 localhost systemd[1]: Starting dracut pre-mount hook...
Jan 23 09:00:49 localhost systemd[1]: Finished dracut pre-mount hook.
Jan 23 09:00:49 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40...
Jan 23 09:00:49 localhost systemd-fsck[555]: /usr/sbin/fsck.xfs: XFS file system.
Jan 23 09:00:49 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40.
Jan 23 09:00:49 localhost systemd[1]: Mounting /sysroot...
Jan 23 09:00:49 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Jan 23 09:00:49 localhost kernel: XFS (vda1): Mounting V5 Filesystem 22ac9141-3960-4912-b20e-19fc8a328d40
Jan 23 09:00:50 localhost kernel: XFS (vda1): Ending clean mount
Jan 23 09:00:50 localhost systemd[1]: Mounted /sysroot.
Jan 23 09:00:50 localhost systemd[1]: Reached target Initrd Root File System.
Jan 23 09:00:50 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Jan 23 09:00:50 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Jan 23 09:00:50 localhost systemd[1]: Reached target Initrd File Systems.
Jan 23 09:00:50 localhost systemd[1]: Reached target Initrd Default Target.
Jan 23 09:00:50 localhost systemd[1]: Starting dracut mount hook...
Jan 23 09:00:50 localhost systemd[1]: Finished dracut mount hook.
Jan 23 09:00:50 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Jan 23 09:00:50 localhost rpc.idmapd[448]: exiting on signal 15
Jan 23 09:00:50 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Jan 23 09:00:50 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Jan 23 09:00:50 localhost systemd[1]: Stopped target Network.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Timer Units.
Jan 23 09:00:50 localhost systemd[1]: dbus.socket: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Jan 23 09:00:50 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Initrd Default Target.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Basic System.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Initrd Root Device.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Initrd /usr File System.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Path Units.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Remote File Systems.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Slice Units.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Socket Units.
Jan 23 09:00:50 localhost systemd[1]: Stopped target System Initialization.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Local File Systems.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Swaps.
Jan 23 09:00:50 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped dracut mount hook.
Jan 23 09:00:50 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped dracut pre-mount hook.
Jan 23 09:00:50 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Jan 23 09:00:50 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Jan 23 09:00:50 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped dracut initqueue hook.
Jan 23 09:00:50 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Apply Kernel Variables.
Jan 23 09:00:50 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Jan 23 09:00:50 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Coldplug All udev Devices.
Jan 23 09:00:50 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped dracut pre-trigger hook.
Jan 23 09:00:50 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Jan 23 09:00:50 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Setup Virtual Console.
Jan 23 09:00:50 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Jan 23 09:00:50 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Jan 23 09:00:50 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Closed udev Control Socket.
Jan 23 09:00:50 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Closed udev Kernel Socket.
Jan 23 09:00:50 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped dracut pre-udev hook.
Jan 23 09:00:50 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped dracut cmdline hook.
Jan 23 09:00:50 localhost systemd[1]: Starting Cleanup udev Database...
Jan 23 09:00:50 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Jan 23 09:00:50 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Jan 23 09:00:50 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Stopped Create System Users.
Jan 23 09:00:50 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jan 23 09:00:50 localhost systemd[1]: Finished Cleanup udev Database.
Jan 23 09:00:50 localhost systemd[1]: Reached target Switch Root.
Jan 23 09:00:50 localhost systemd[1]: Starting Switch Root...
Jan 23 09:00:50 localhost systemd[1]: Switching root.
Jan 23 09:00:50 localhost systemd-journald[307]: Journal stopped
Jan 23 09:00:51 localhost systemd-journald[307]: Received SIGTERM from PID 1 (systemd).
Jan 23 09:00:51 localhost kernel: audit: type=1404 audit(1769158850.837:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Jan 23 09:00:51 localhost kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:00:51 localhost kernel: SELinux:  policy capability open_perms=1
Jan 23 09:00:51 localhost kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:00:51 localhost kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:00:51 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:00:51 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:00:51 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:00:51 localhost kernel: audit: type=1403 audit(1769158850.970:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jan 23 09:00:51 localhost systemd[1]: Successfully loaded SELinux policy in 137.140ms.
Jan 23 09:00:51 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 30.224ms.
Jan 23 09:00:51 localhost systemd[1]: systemd 252-64.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Jan 23 09:00:51 localhost systemd[1]: Detected virtualization kvm.
Jan 23 09:00:51 localhost systemd[1]: Detected architecture x86-64.
Jan 23 09:00:51 localhost systemd-rc-local-generator[636]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:00:51 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped Switch Root.
Jan 23 09:00:51 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jan 23 09:00:51 localhost systemd[1]: Created slice Slice /system/getty.
Jan 23 09:00:51 localhost systemd[1]: Created slice Slice /system/serial-getty.
Jan 23 09:00:51 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Jan 23 09:00:51 localhost systemd[1]: Created slice User and Session Slice.
Jan 23 09:00:51 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Jan 23 09:00:51 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Jan 23 09:00:51 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Jan 23 09:00:51 localhost systemd[1]: Reached target Local Encrypted Volumes.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Switch Root.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Initrd File Systems.
Jan 23 09:00:51 localhost systemd[1]: Stopped target Initrd Root File System.
Jan 23 09:00:51 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Jan 23 09:00:51 localhost systemd[1]: Reached target Path Units.
Jan 23 09:00:51 localhost systemd[1]: Reached target rpc_pipefs.target.
Jan 23 09:00:51 localhost systemd[1]: Reached target Slice Units.
Jan 23 09:00:51 localhost systemd[1]: Reached target Swaps.
Jan 23 09:00:51 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Jan 23 09:00:51 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Jan 23 09:00:51 localhost systemd[1]: Reached target RPC Port Mapper.
Jan 23 09:00:51 localhost systemd[1]: Listening on Process Core Dump Socket.
Jan 23 09:00:51 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Jan 23 09:00:51 localhost systemd[1]: Listening on udev Control Socket.
Jan 23 09:00:51 localhost systemd[1]: Listening on udev Kernel Socket.
Jan 23 09:00:51 localhost systemd[1]: Mounting Huge Pages File System...
Jan 23 09:00:51 localhost systemd[1]: Mounting POSIX Message Queue File System...
Jan 23 09:00:51 localhost systemd[1]: Mounting Kernel Debug File System...
Jan 23 09:00:51 localhost systemd[1]: Mounting Kernel Trace File System...
Jan 23 09:00:51 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 09:00:51 localhost systemd[1]: Starting Create List of Static Device Nodes...
Jan 23 09:00:51 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 09:00:51 localhost systemd[1]: Starting Load Kernel Module drm...
Jan 23 09:00:51 localhost systemd[1]: Starting Load Kernel Module efi_pstore...
Jan 23 09:00:51 localhost systemd[1]: Starting Load Kernel Module fuse...
Jan 23 09:00:51 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Jan 23 09:00:51 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Stopped File System Check on Root Device.
Jan 23 09:00:51 localhost systemd[1]: Stopped Journal Service.
Jan 23 09:00:51 localhost systemd[1]: Starting Journal Service...
Jan 23 09:00:51 localhost systemd[1]: Load Kernel Modules was skipped because no trigger condition checks were met.
Jan 23 09:00:51 localhost kernel: fuse: init (API version 7.37)
Jan 23 09:00:51 localhost systemd[1]: Starting Generate network units from Kernel command line...
Jan 23 09:00:51 localhost systemd[1]: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 09:00:51 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Jan 23 09:00:51 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Jan 23 09:00:51 localhost systemd[1]: Starting Apply Kernel Variables...
Jan 23 09:00:51 localhost systemd[1]: Starting Coldplug All udev Devices...
Jan 23 09:00:51 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Jan 23 09:00:51 localhost systemd[1]: Mounted Huge Pages File System.
Jan 23 09:00:51 localhost systemd[1]: Mounted POSIX Message Queue File System.
Jan 23 09:00:51 localhost systemd[1]: Mounted Kernel Debug File System.
Jan 23 09:00:51 localhost systemd[1]: Mounted Kernel Trace File System.
Jan 23 09:00:51 localhost systemd[1]: Finished Create List of Static Device Nodes.
Jan 23 09:00:51 localhost systemd-journald[677]: Journal started
Jan 23 09:00:51 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 09:00:51 localhost systemd[1]: Queued start job for default target Multi-User System.
Jan 23 09:00:51 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Started Journal Service.
Jan 23 09:00:51 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 09:00:51 localhost systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished Load Kernel Module efi_pstore.
Jan 23 09:00:51 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished Load Kernel Module fuse.
Jan 23 09:00:51 localhost kernel: ACPI: bus type drm_connector registered
Jan 23 09:00:51 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Jan 23 09:00:51 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Jan 23 09:00:51 localhost systemd[1]: Finished Load Kernel Module drm.
Jan 23 09:00:51 localhost systemd[1]: Finished Generate network units from Kernel command line.
Jan 23 09:00:51 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Jan 23 09:00:51 localhost systemd[1]: Finished Apply Kernel Variables.
Jan 23 09:00:51 localhost systemd[1]: Mounting FUSE Control File System...
Jan 23 09:00:51 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 09:00:51 localhost systemd[1]: Starting Rebuild Hardware Database...
Jan 23 09:00:51 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Jan 23 09:00:51 localhost systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jan 23 09:00:51 localhost systemd[1]: Starting Load/Save OS Random Seed...
Jan 23 09:00:51 localhost systemd[1]: Starting Create System Users...
Jan 23 09:00:51 localhost systemd[1]: Mounted FUSE Control File System.
Jan 23 09:00:51 localhost systemd-journald[677]: Runtime Journal (/run/log/journal/85ac68c10a6e7ae08ceb898dbdca0cb5) is 8.0M, max 153.6M, 145.6M free.
Jan 23 09:00:51 localhost systemd-journald[677]: Received client request to flush runtime journal.
Jan 23 09:00:51 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Jan 23 09:00:51 localhost systemd[1]: Finished Coldplug All udev Devices.
Jan 23 09:00:51 localhost systemd[1]: Finished Load/Save OS Random Seed.
Jan 23 09:00:51 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Jan 23 09:00:51 localhost systemd[1]: Finished Create System Users.
Jan 23 09:00:51 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Jan 23 09:00:51 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Jan 23 09:00:51 localhost systemd[1]: Reached target Preparation for Local File Systems.
Jan 23 09:00:51 localhost systemd[1]: Reached target Local File Systems.
Jan 23 09:00:51 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Jan 23 09:00:51 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Jan 23 09:00:51 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jan 23 09:00:51 localhost systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Jan 23 09:00:51 localhost systemd[1]: Starting Automatic Boot Loader Update...
Jan 23 09:00:51 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Jan 23 09:00:51 localhost systemd[1]: Starting Create Volatile Files and Directories...
Jan 23 09:00:51 localhost bootctl[695]: Couldn't find EFI system partition, skipping.
Jan 23 09:00:51 localhost systemd[1]: Finished Automatic Boot Loader Update.
Jan 23 09:00:51 localhost systemd[1]: Finished Create Volatile Files and Directories.
Jan 23 09:00:51 localhost systemd[1]: Starting Security Auditing Service...
Jan 23 09:00:51 localhost systemd[1]: Starting RPC Bind...
Jan 23 09:00:51 localhost auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Jan 23 09:00:51 localhost auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Jan 23 09:00:51 localhost systemd[1]: Starting Rebuild Journal Catalog...
Jan 23 09:00:51 localhost systemd[1]: Finished Rebuild Journal Catalog.
Jan 23 09:00:51 localhost systemd[1]: Started RPC Bind.
Jan 23 09:00:51 localhost augenrules[706]: /sbin/augenrules: No change
Jan 23 09:00:51 localhost augenrules[721]: No rules
Jan 23 09:00:51 localhost augenrules[721]: enabled 1
Jan 23 09:00:51 localhost augenrules[721]: failure 1
Jan 23 09:00:51 localhost augenrules[721]: pid 701
Jan 23 09:00:51 localhost augenrules[721]: rate_limit 0
Jan 23 09:00:51 localhost augenrules[721]: backlog_limit 8192
Jan 23 09:00:51 localhost augenrules[721]: lost 0
Jan 23 09:00:51 localhost augenrules[721]: backlog 1
Jan 23 09:00:51 localhost augenrules[721]: backlog_wait_time 60000
Jan 23 09:00:51 localhost augenrules[721]: backlog_wait_time_actual 0
Jan 23 09:00:51 localhost augenrules[721]: enabled 1
Jan 23 09:00:51 localhost augenrules[721]: failure 1
Jan 23 09:00:51 localhost augenrules[721]: pid 701
Jan 23 09:00:51 localhost augenrules[721]: rate_limit 0
Jan 23 09:00:51 localhost augenrules[721]: backlog_limit 8192
Jan 23 09:00:51 localhost augenrules[721]: lost 0
Jan 23 09:00:51 localhost augenrules[721]: backlog 0
Jan 23 09:00:51 localhost augenrules[721]: backlog_wait_time 60000
Jan 23 09:00:51 localhost augenrules[721]: backlog_wait_time_actual 0
Jan 23 09:00:51 localhost augenrules[721]: enabled 1
Jan 23 09:00:51 localhost augenrules[721]: failure 1
Jan 23 09:00:51 localhost augenrules[721]: pid 701
Jan 23 09:00:51 localhost augenrules[721]: rate_limit 0
Jan 23 09:00:51 localhost augenrules[721]: backlog_limit 8192
Jan 23 09:00:51 localhost augenrules[721]: lost 0
Jan 23 09:00:51 localhost augenrules[721]: backlog 0
Jan 23 09:00:51 localhost augenrules[721]: backlog_wait_time 60000
Jan 23 09:00:51 localhost augenrules[721]: backlog_wait_time_actual 0
Jan 23 09:00:51 localhost systemd[1]: Started Security Auditing Service.
Jan 23 09:00:51 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Jan 23 09:00:51 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Jan 23 09:00:52 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Jan 23 09:00:52 localhost systemd[1]: Finished Rebuild Hardware Database.
Jan 23 09:00:52 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Jan 23 09:00:52 localhost systemd[1]: Starting Update is Completed...
Jan 23 09:00:52 localhost systemd[1]: Finished Update is Completed.
Jan 23 09:00:52 localhost systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Jan 23 09:00:52 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Jan 23 09:00:52 localhost systemd[1]: Reached target System Initialization.
Jan 23 09:00:52 localhost systemd[1]: Started dnf makecache --timer.
Jan 23 09:00:52 localhost systemd[1]: Started Daily rotation of log files.
Jan 23 09:00:52 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Jan 23 09:00:52 localhost systemd[1]: Reached target Timer Units.
Jan 23 09:00:52 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Jan 23 09:00:52 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Jan 23 09:00:52 localhost systemd[1]: Reached target Socket Units.
Jan 23 09:00:52 localhost systemd[1]: Starting D-Bus System Message Bus...
Jan 23 09:00:52 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 09:00:52 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Jan 23 09:00:52 localhost systemd[1]: Starting Load Kernel Module configfs...
Jan 23 09:00:52 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jan 23 09:00:52 localhost systemd[1]: Finished Load Kernel Module configfs.
Jan 23 09:00:52 localhost systemd-udevd[738]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:00:52 localhost systemd[1]: Started D-Bus System Message Bus.
Jan 23 09:00:52 localhost systemd[1]: Reached target Basic System.
Jan 23 09:00:52 localhost systemd[1]: Starting NTP client/server...
Jan 23 09:00:52 localhost dbus-broker-lau[753]: Ready
Jan 23 09:00:52 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Jan 23 09:00:52 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Jan 23 09:00:52 localhost kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Jan 23 09:00:52 localhost kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Jan 23 09:00:52 localhost chronyd[781]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 09:00:52 localhost chronyd[781]: Loaded 0 symmetric keys
Jan 23 09:00:52 localhost chronyd[781]: Using right/UTC timezone to obtain leap second data
Jan 23 09:00:52 localhost chronyd[781]: Loaded seccomp filter (level 2)
Jan 23 09:00:52 localhost systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Jan 23 09:00:52 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Jan 23 09:00:52 localhost systemd[1]: Starting IPv4 firewall with iptables...
Jan 23 09:00:52 localhost systemd[1]: Started irqbalance daemon.
Jan 23 09:00:52 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Jan 23 09:00:52 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:00:52 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:00:52 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 09:00:52 localhost systemd[1]: Reached target sshd-keygen.target.
Jan 23 09:00:52 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Jan 23 09:00:52 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Jan 23 09:00:52 localhost kernel: kvm_amd: TSC scaling supported
Jan 23 09:00:52 localhost kernel: kvm_amd: Nested Virtualization enabled
Jan 23 09:00:52 localhost kernel: kvm_amd: Nested Paging enabled
Jan 23 09:00:52 localhost kernel: kvm_amd: LBR virtualization supported
Jan 23 09:00:52 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Jan 23 09:00:52 localhost systemd[1]: Reached target User and Group Name Lookups.
Jan 23 09:00:52 localhost kernel: Console: switching to colour dummy device 80x25
Jan 23 09:00:52 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Jan 23 09:00:52 localhost kernel: [drm] features: -context_init
Jan 23 09:00:52 localhost kernel: [drm] number of scanouts: 1
Jan 23 09:00:52 localhost kernel: [drm] number of cap sets: 0
Jan 23 09:00:52 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Jan 23 09:00:52 localhost kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Jan 23 09:00:52 localhost kernel: Console: switching to colour frame buffer device 128x48
Jan 23 09:00:52 localhost kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Jan 23 09:00:52 localhost kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Jan 23 09:00:52 localhost kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Jan 23 09:00:52 localhost systemd[1]: Starting User Login Management...
Jan 23 09:00:52 localhost systemd[1]: Started NTP client/server.
Jan 23 09:00:52 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Jan 23 09:00:53 localhost systemd-logind[807]: New seat seat0.
Jan 23 09:00:53 localhost systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 09:00:53 localhost systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 09:00:53 localhost systemd[1]: Started User Login Management.
Jan 23 09:00:53 localhost iptables.init[789]: iptables: Applying firewall rules: [  OK  ]
Jan 23 09:00:53 localhost systemd[1]: Finished IPv4 firewall with iptables.
Jan 23 09:00:53 localhost cloud-init[838]: Cloud-init v. 24.4-8.el9 running 'init-local' at Fri, 23 Jan 2026 09:00:53 +0000. Up 7.99 seconds.
Jan 23 09:00:53 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Jan 23 09:00:53 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Jan 23 09:00:53 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpl3lm0fky.mount: Deactivated successfully.
Jan 23 09:00:53 localhost systemd[1]: Starting Hostname Service...
Jan 23 09:00:53 localhost systemd[1]: Started Hostname Service.
Jan 23 09:00:53 np0005593294.novalocal systemd-hostnamed[852]: Hostname set to <np0005593294.novalocal> (static)
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Reached target Preparation for Network.
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Starting Network Manager...
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3524] NetworkManager (version 1.54.3-2.el9) is starting... (boot:0ec3f185-e60c-43ea-a74e-c21caf2508ae)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3529] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3612] manager[0x561d9de58000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3649] hostname: hostname: using hostnamed
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3649] hostname: static hostname changed from (none) to "np0005593294.novalocal"
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3657] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3815] manager[0x561d9de58000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.3816] manager[0x561d9de58000]: rfkill: WWAN hardware radio set enabled
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4041] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4042] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4047] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4048] manager: Networking is enabled by state file
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4052] settings: Loaded settings plugin: keyfile (internal)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4067] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4101] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4119] dhcp: init: Using DHCP client 'internal'
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4123] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4145] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4157] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4171] device (lo): Activation: starting connection 'lo' (6a1055b1-2674-4e8e-9fff-1fce9dcc1052)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4187] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4192] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4233] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4241] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4244] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4247] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4251] device (eth0): carrier: link connected
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4256] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4273] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Started Network Manager.
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4281] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4288] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4289] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4293] manager: NetworkManager state is now CONNECTING
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4295] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Reached target Network.
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4306] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4309] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4350] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4359] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4379] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4492] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4495] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4501] device (lo): Activation: successful, device activated.
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4521] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4523] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4527] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4529] device (eth0): Activation: successful, device activated.
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4534] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 09:00:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769158854.4537] manager: startup complete
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Reached target NFS client services.
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Reached target Remote File Systems.
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 23 09:00:54 np0005593294.novalocal systemd[1]: Starting Cloud-init: Network Stage...
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: Cloud-init v. 24.4-8.el9 running 'init' at Fri, 23 Jan 2026 09:00:54 +0000. Up 9.23 seconds.
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |  eth0  | True |         38.129.56.30         | 255.255.255.0 | global | fa:16:3e:02:24:b4 |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |  eth0  | True | fe80::f816:3eff:fe02:24b4/64 |       .       |  link  | fa:16:3e:02:24:b4 |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: ++++++++++++++++++++++++++++++++Route IPv4 info++++++++++++++++++++++++++++++++
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: | Route |   Destination   |   Gateway   |     Genmask     | Interface | Flags |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |   0   |     0.0.0.0     | 38.129.56.1 |     0.0.0.0     |    eth0   |   UG  |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |   1   |   38.129.56.0   |   0.0.0.0   |  255.255.255.0  |    eth0   |   U   |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |   2   | 169.254.169.254 | 38.129.56.5 | 255.255.255.255 |    eth0   |  UGH  |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +-------+-----------------+-------------+-----------------+-----------+-------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Jan 23 09:00:54 np0005593294.novalocal cloud-init[920]: ci-info: +-------+-------------+---------+-----------+-------+
Jan 23 09:00:57 np0005593294.novalocal useradd[989]: new group: name=cloud-user, GID=1001
Jan 23 09:00:57 np0005593294.novalocal useradd[989]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Jan 23 09:00:57 np0005593294.novalocal useradd[989]: add 'cloud-user' to group 'adm'
Jan 23 09:00:57 np0005593294.novalocal useradd[989]: add 'cloud-user' to group 'systemd-journal'
Jan 23 09:00:57 np0005593294.novalocal useradd[989]: add 'cloud-user' to shadow group 'adm'
Jan 23 09:00:57 np0005593294.novalocal useradd[989]: add 'cloud-user' to shadow group 'systemd-journal'
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Generating public/private rsa key pair.
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: The key fingerprint is:
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: SHA256:edOWuOchfDQbnTRH8Bc/YcFEmOHQYCN+DGWnL80pn3E root@np0005593294.novalocal
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: The key's randomart image is:
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: +---[RSA 3072]----+
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |         o.BooOOo|
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |        . * *+.++|
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |         . + .o.=|
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |         ..o++.+o|
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |        S +oB*oE |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |         o =+++  |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |          + =o   |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |           = .   |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |            .    |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: +----[SHA256]-----+
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Generating public/private ecdsa key pair.
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: The key fingerprint is:
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: SHA256:o0A+QcrZ0hR2kiK37DAHwDSF5p7eUg06J2fwyvGJpUo root@np0005593294.novalocal
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: The key's randomart image is:
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: +---[ECDSA 256]---+
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |=oo.*o.          |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |o=+O.o           |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |o==o=            |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |oo+= .           |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |.== *   S        |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: | B.B + . .       |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |oE/ . .          |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |.B +             |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |o .              |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: +----[SHA256]-----+
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Generating public/private ed25519 key pair.
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: The key fingerprint is:
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: SHA256:yXgWegZQ6Icfku55tlvbeBAKsmPA73B26hK/CNQHO40 root@np0005593294.novalocal
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: The key's randomart image is:
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: +--[ED25519 256]--+
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |    .o.          |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |    ..           |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |.  o o. .        |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |..o @ o=.o       |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: | o.E BooS.       |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |.o+++.o=.        |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |..Boo.  ..       |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: |...+o o. +.      |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: | .oo.oooo..      |
Jan 23 09:00:58 np0005593294.novalocal cloud-init[920]: +----[SHA256]-----+
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Finished Cloud-init: Network Stage.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Reached target Cloud-config availability.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Reached target Network is Online.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting Cloud-init: Config Stage...
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting Crash recovery kernel arming...
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting System Logging Service...
Jan 23 09:00:58 np0005593294.novalocal sm-notify[1005]: Version 2.5.4 starting
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting OpenSSH server daemon...
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting Permit User Sessions...
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Started Notify NFS peers of a restart.
Jan 23 09:00:58 np0005593294.novalocal sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 23 09:00:58 np0005593294.novalocal sshd[1007]: Server listening on :: port 22.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Started OpenSSH server daemon.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Finished Permit User Sessions.
Jan 23 09:00:58 np0005593294.novalocal rsyslogd[1006]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1006" x-info="https://www.rsyslog.com"] start
Jan 23 09:00:58 np0005593294.novalocal rsyslogd[1006]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Started Command Scheduler.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Started Getty on tty1.
Jan 23 09:00:58 np0005593294.novalocal crond[1010]: (CRON) STARTUP (1.5.7)
Jan 23 09:00:58 np0005593294.novalocal crond[1010]: (CRON) INFO (Syslog will be used instead of sendmail.)
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Started Serial Getty on ttyS0.
Jan 23 09:00:58 np0005593294.novalocal crond[1010]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 61% if used.)
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Reached target Login Prompts.
Jan 23 09:00:58 np0005593294.novalocal crond[1010]: (CRON) INFO (running with inotify support)
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Started System Logging Service.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Reached target Multi-User System.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Jan 23 09:00:58 np0005593294.novalocal rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:00:58 np0005593294.novalocal kdumpctl[1019]: kdump: No kdump initial ramdisk found.
Jan 23 09:00:58 np0005593294.novalocal kdumpctl[1019]: kdump: Rebuilding /boot/initramfs-5.14.0-661.el9.x86_64kdump.img
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1077]: Connection reset by 38.102.83.114 port 58608 [preauth]
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1098]: Unable to negotiate with 38.102.83.114 port 49522: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1121]: Unable to negotiate with 38.102.83.114 port 49534: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1134]: Unable to negotiate with 38.102.83.114 port 49538: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1152]: Cloud-init v. 24.4-8.el9 running 'modules:config' at Fri, 23 Jan 2026 09:00:58 +0000. Up 12.85 seconds.
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1151]: Connection reset by 38.102.83.114 port 49556 [preauth]
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1156]: Unable to negotiate with 38.102.83.114 port 49564: no matching host key type found. Their offer: ssh-rsa,ssh-rsa-cert-v01@openssh.com [preauth]
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1164]: Unable to negotiate with 38.102.83.114 port 49580: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1107]: Connection closed by 38.102.83.114 port 49524 [preauth]
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Finished Cloud-init: Config Stage.
Jan 23 09:00:58 np0005593294.novalocal systemd[1]: Starting Cloud-init: Final Stage...
Jan 23 09:00:58 np0005593294.novalocal sshd-session[1144]: Connection closed by 38.102.83.114 port 49544 [preauth]
Jan 23 09:00:58 np0005593294.novalocal dracut[1285]: dracut-057-102.git20250818.el9
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1303]: Cloud-init v. 24.4-8.el9 running 'modules:final' at Fri, 23 Jan 2026 09:00:58 +0000. Up 13.23 seconds.
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1314]: #############################################################
Jan 23 09:00:58 np0005593294.novalocal dracut[1287]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/22ac9141-3960-4912-b20e-19fc8a328d40 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-661.el9.x86_64kdump.img 5.14.0-661.el9.x86_64
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1316]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1323]: 256 SHA256:o0A+QcrZ0hR2kiK37DAHwDSF5p7eUg06J2fwyvGJpUo root@np0005593294.novalocal (ECDSA)
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1328]: 256 SHA256:yXgWegZQ6Icfku55tlvbeBAKsmPA73B26hK/CNQHO40 root@np0005593294.novalocal (ED25519)
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1334]: 3072 SHA256:edOWuOchfDQbnTRH8Bc/YcFEmOHQYCN+DGWnL80pn3E root@np0005593294.novalocal (RSA)
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1336]: -----END SSH HOST KEY FINGERPRINTS-----
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1337]: #############################################################
Jan 23 09:00:58 np0005593294.novalocal cloud-init[1303]: Cloud-init v. 24.4-8.el9 finished at Fri, 23 Jan 2026 09:00:58 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 13.42 seconds
Jan 23 09:00:59 np0005593294.novalocal systemd[1]: Finished Cloud-init: Final Stage.
Jan 23 09:00:59 np0005593294.novalocal systemd[1]: Reached target Cloud-init target.
Jan 23 09:00:59 np0005593294.novalocal chronyd[781]: Selected source 198.181.199.82 (2.centos.pool.ntp.org)
Jan 23 09:00:59 np0005593294.novalocal chronyd[781]: System clock TAI offset set to 37 seconds
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: Module 'ifcfg' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: Module 'plymouth' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: Module 'resume' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: Module 'earlykdump' will not be installed, because it's in the list to be omitted!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: memstrack is not available
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Jan 23 09:00:59 np0005593294.novalocal dracut[1287]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: memstrack is not available
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: *** Including module: systemd ***
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: *** Including module: fips ***
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: *** Including module: systemd-initrd ***
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: *** Including module: i18n ***
Jan 23 09:01:00 np0005593294.novalocal dracut[1287]: *** Including module: drm ***
Jan 23 09:01:01 np0005593294.novalocal CROND[2178]: (root) CMD (run-parts /etc/cron.hourly)
Jan 23 09:01:01 np0005593294.novalocal run-parts[2186]: (/etc/cron.hourly) starting 0anacron
Jan 23 09:01:01 np0005593294.novalocal anacron[2198]: Anacron started on 2026-01-23
Jan 23 09:01:01 np0005593294.novalocal anacron[2198]: Will run job `cron.daily' in 44 min.
Jan 23 09:01:01 np0005593294.novalocal anacron[2198]: Will run job `cron.weekly' in 64 min.
Jan 23 09:01:01 np0005593294.novalocal anacron[2198]: Will run job `cron.monthly' in 84 min.
Jan 23 09:01:01 np0005593294.novalocal anacron[2198]: Jobs will be executed sequentially
Jan 23 09:01:01 np0005593294.novalocal run-parts[2202]: (/etc/cron.hourly) finished 0anacron
Jan 23 09:01:01 np0005593294.novalocal CROND[2176]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: prefixdevname ***
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: kernel-modules ***
Jan 23 09:01:01 np0005593294.novalocal kernel: block vda: the capability attribute has been deprecated.
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: kernel-modules-extra ***
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: qemu ***
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: fstab-sys ***
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: rootfs-block ***
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: terminfo ***
Jan 23 09:01:01 np0005593294.novalocal dracut[1287]: *** Including module: udev-rules ***
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: Skipping udev rule: 91-permissions.rules
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: Skipping udev rule: 80-drivers-modprobe.rules
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: *** Including module: virtiofs ***
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: *** Including module: dracut-systemd ***
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: *** Including module: usrmount ***
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: *** Including module: base ***
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: *** Including module: fs-lib ***
Jan 23 09:01:02 np0005593294.novalocal dracut[1287]: *** Including module: kdumpbase ***
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]: *** Including module: microcode_ctl-fw_dir_override ***
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:   microcode_ctl module: mangling fw_dir
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: Cannot change IRQ 25 affinity: Operation not permitted
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: IRQ 25 affinity is now unmanaged
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: Cannot change IRQ 31 affinity: Operation not permitted
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: IRQ 31 affinity is now unmanaged
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: Cannot change IRQ 28 affinity: Operation not permitted
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: IRQ 28 affinity is now unmanaged
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: Cannot change IRQ 32 affinity: Operation not permitted
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: IRQ 32 affinity is now unmanaged
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: Cannot change IRQ 30 affinity: Operation not permitted
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: IRQ 30 affinity is now unmanaged
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: Cannot change IRQ 29 affinity: Operation not permitted
Jan 23 09:01:03 np0005593294.novalocal irqbalance[791]: IRQ 29 affinity is now unmanaged
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: configuration "intel-06-8f-08" is ignored
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]:     microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]: *** Including module: openssl ***
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]: *** Including module: shutdown ***
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]: *** Including module: squash ***
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]: *** Including modules done ***
Jan 23 09:01:03 np0005593294.novalocal dracut[1287]: *** Installing kernel module dependencies ***
Jan 23 09:01:04 np0005593294.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:01:04 np0005593294.novalocal dracut[1287]: *** Installing kernel module dependencies done ***
Jan 23 09:01:04 np0005593294.novalocal dracut[1287]: *** Resolving executable dependencies ***
Jan 23 09:01:06 np0005593294.novalocal dracut[1287]: *** Resolving executable dependencies done ***
Jan 23 09:01:06 np0005593294.novalocal dracut[1287]: *** Generating early-microcode cpio image ***
Jan 23 09:01:06 np0005593294.novalocal dracut[1287]: *** Store current command line parameters ***
Jan 23 09:01:06 np0005593294.novalocal dracut[1287]: Stored kernel commandline:
Jan 23 09:01:06 np0005593294.novalocal dracut[1287]: No dracut internal kernel commandline stored in the initramfs
Jan 23 09:01:06 np0005593294.novalocal dracut[1287]: *** Install squash loader ***
Jan 23 09:01:07 np0005593294.novalocal dracut[1287]: *** Squashing the files inside the initramfs ***
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: *** Squashing the files inside the initramfs done ***
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: *** Creating image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' ***
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: *** Hardlinking files ***
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: Mode:           real
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: Files:          50
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: Linked:         0 files
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: Compared:       0 xattrs
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: Compared:       0 files
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: Saved:          0 B
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: Duration:       0.001079 seconds
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: *** Hardlinking files done ***
Jan 23 09:01:08 np0005593294.novalocal dracut[1287]: *** Creating initramfs image file '/boot/initramfs-5.14.0-661.el9.x86_64kdump.img' done ***
Jan 23 09:01:09 np0005593294.novalocal kdumpctl[1019]: kdump: kexec: loaded kdump kernel
Jan 23 09:01:09 np0005593294.novalocal kdumpctl[1019]: kdump: Starting kdump: [OK]
Jan 23 09:01:09 np0005593294.novalocal systemd[1]: Finished Crash recovery kernel arming.
Jan 23 09:01:09 np0005593294.novalocal systemd[1]: Startup finished in 2.342s (kernel) + 2.968s (initrd) + 18.552s (userspace) = 23.862s.
Jan 23 09:01:24 np0005593294.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:01:44 np0005593294.novalocal sshd-session[4321]: Accepted publickey for zuul from 38.102.83.114 port 52204 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Jan 23 09:01:44 np0005593294.novalocal systemd[1]: Created slice User Slice of UID 1000.
Jan 23 09:01:44 np0005593294.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Jan 23 09:01:44 np0005593294.novalocal systemd-logind[807]: New session 1 of user zuul.
Jan 23 09:01:44 np0005593294.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Jan 23 09:01:44 np0005593294.novalocal systemd[1]: Starting User Manager for UID 1000...
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Queued start job for default target Main User Target.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Created slice User Application Slice.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Reached target Paths.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Reached target Timers.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Starting D-Bus User Message Bus Socket...
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Starting Create User's Volatile Files and Directories...
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Reached target Sockets.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Finished Create User's Volatile Files and Directories.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Reached target Basic System.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Reached target Main User Target.
Jan 23 09:01:44 np0005593294.novalocal systemd[4325]: Startup finished in 128ms.
Jan 23 09:01:44 np0005593294.novalocal systemd[1]: Started User Manager for UID 1000.
Jan 23 09:01:44 np0005593294.novalocal systemd[1]: Started Session 1 of User zuul.
Jan 23 09:01:44 np0005593294.novalocal sshd-session[4321]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:01:45 np0005593294.novalocal python3[4407]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:48 np0005593294.novalocal python3[4435]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:55 np0005593294.novalocal python3[4493]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:01:56 np0005593294.novalocal python3[4533]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Jan 23 09:01:58 np0005593294.novalocal python3[4559]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQChWBsfs5FtlYIS47KhLNXtsYVhP6UT/w4WYq1l1d/b7+cXPAwAb4Qt1cc/BmNcKM419a6D+CvPejxC67s0h4ksuceBjB/s6b88/zjf8Lio8Dd87f6J+f6IY8ByYIQ8s3Hvn6z0K7HSyEMuQ0B/CLxeBW4MJFqcoLK2v7Y8SNPGLr8w/8y79OWnJJPKmfM4ACTo2JwqmPGI/4+LQsCZS/p/yKDTO5AYxsIUwWw/IX3Jxs67UOBqa40onmgM/VRkfGY512fziVUNkmFHG2Aqgosbpbz/XysrVTpvLRA/H2zpGbbTbuEg6xp8vHQO5V0csAd6p3cdOixjdaPmf9oy3+yXuIeWwnnxPHqvVDY6N9aaIX4vuajxOoMUFiQ2YtcDq7sCn8HoateyYgIL/u2+pInArUiYGemyMEWja0DhD6UdCkY0Ea+YDWeIZKM505N+HClR5jfjjVW35TndY+AldV5OhOzMRmPjtJYS8a0usUXRvmxRfMFSmO9CI1RfNmod9X0= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:01:59 np0005593294.novalocal python3[4583]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:01:59 np0005593294.novalocal python3[4682]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:00 np0005593294.novalocal python3[4753]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158919.4722352-252-40954728514482/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa follow=False checksum=70fc72f3adde7c23bd22f0e2ad4ebdd2e15c011a backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:00 np0005593294.novalocal python3[4876]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:01 np0005593294.novalocal python3[4947]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769158920.550354-307-128609600897818/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=79d1f7c5e92f4d57bb17665cf28be8d8_id_rsa.pub follow=False checksum=1817e5216c13f90f69486a375706d090e99f2d79 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:02 np0005593294.novalocal python3[4995]: ansible-ping Invoked with data=pong
Jan 23 09:02:03 np0005593294.novalocal python3[5019]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:02:05 np0005593294.novalocal python3[5077]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Jan 23 09:02:07 np0005593294.novalocal python3[5109]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:07 np0005593294.novalocal python3[5133]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:07 np0005593294.novalocal python3[5157]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:07 np0005593294.novalocal python3[5181]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:08 np0005593294.novalocal python3[5205]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:08 np0005593294.novalocal python3[5229]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:10 np0005593294.novalocal sudo[5253]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evkybulpvjvxjwcduxumbwwrkueqvcbk ; /usr/bin/python3'
Jan 23 09:02:10 np0005593294.novalocal sudo[5253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:10 np0005593294.novalocal python3[5255]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:10 np0005593294.novalocal sudo[5253]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:10 np0005593294.novalocal sudo[5331]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytofllxmyxtlbnungcmbaqxobhrvurft ; /usr/bin/python3'
Jan 23 09:02:10 np0005593294.novalocal sudo[5331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:10 np0005593294.novalocal python3[5333]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:10 np0005593294.novalocal sudo[5331]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:11 np0005593294.novalocal sudo[5404]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzornnnmjofxxgsqlsnmhhqtraapvgmo ; /usr/bin/python3'
Jan 23 09:02:11 np0005593294.novalocal sudo[5404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:11 np0005593294.novalocal python3[5406]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158930.4675694-32-101246637039073/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:11 np0005593294.novalocal sudo[5404]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:12 np0005593294.novalocal python3[5454]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:12 np0005593294.novalocal python3[5478]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:12 np0005593294.novalocal python3[5502]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:12 np0005593294.novalocal python3[5526]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:13 np0005593294.novalocal python3[5550]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:13 np0005593294.novalocal python3[5574]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:13 np0005593294.novalocal python3[5598]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593294.novalocal python3[5622]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593294.novalocal python3[5646]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593294.novalocal python3[5670]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:14 np0005593294.novalocal python3[5694]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:15 np0005593294.novalocal python3[5718]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:15 np0005593294.novalocal python3[5742]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:15 np0005593294.novalocal python3[5766]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:15 np0005593294.novalocal python3[5790]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593294.novalocal python3[5814]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593294.novalocal python3[5838]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593294.novalocal python3[5862]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:16 np0005593294.novalocal python3[5886]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:17 np0005593294.novalocal python3[5910]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:17 np0005593294.novalocal python3[5934]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:17 np0005593294.novalocal python3[5958]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593294.novalocal python3[5982]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593294.novalocal python3[6006]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593294.novalocal python3[6030]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:18 np0005593294.novalocal python3[6054]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:02:21 np0005593294.novalocal sudo[6078]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gjtqqjeqgfmotpphlzuhlhkepiwuewvn ; /usr/bin/python3'
Jan 23 09:02:21 np0005593294.novalocal sudo[6078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:21 np0005593294.novalocal python3[6080]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 09:02:21 np0005593294.novalocal systemd[1]: Starting Time & Date Service...
Jan 23 09:02:22 np0005593294.novalocal systemd[1]: Started Time & Date Service.
Jan 23 09:02:22 np0005593294.novalocal systemd-timedated[6082]: Changed time zone to 'UTC' (UTC).
Jan 23 09:02:22 np0005593294.novalocal sudo[6078]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:22 np0005593294.novalocal sudo[6109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mesobgvqenwojajftjtzyhkmqxppkxeb ; /usr/bin/python3'
Jan 23 09:02:22 np0005593294.novalocal sudo[6109]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:22 np0005593294.novalocal python3[6111]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:22 np0005593294.novalocal sudo[6109]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:22 np0005593294.novalocal python3[6187]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:23 np0005593294.novalocal python3[6258]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769158942.657495-252-246197202602001/source _original_basename=tmphpdpiom_ follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:23 np0005593294.novalocal python3[6358]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:24 np0005593294.novalocal python3[6429]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158943.544987-302-124129721618078/source _original_basename=tmpwhou6zj1 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:24 np0005593294.novalocal sudo[6529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bygdueeicrwyiflircfgadodqfitqfzt ; /usr/bin/python3'
Jan 23 09:02:24 np0005593294.novalocal sudo[6529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:24 np0005593294.novalocal python3[6531]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:24 np0005593294.novalocal sudo[6529]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:25 np0005593294.novalocal sudo[6602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imjsilawldyqrjiifvdctgxegalskitg ; /usr/bin/python3'
Jan 23 09:02:25 np0005593294.novalocal sudo[6602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:25 np0005593294.novalocal python3[6604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769158944.7444382-383-20136879228048/source _original_basename=tmpazju_cwh follow=False checksum=96d192923ef836711213a25c6ed0ba1e0702c4c3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:25 np0005593294.novalocal sudo[6602]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:25 np0005593294.novalocal python3[6652]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:02:26 np0005593294.novalocal python3[6678]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:02:27 np0005593294.novalocal sudo[6756]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzvytyzbyuvysqsjelnqfpwwmctxebfo ; /usr/bin/python3'
Jan 23 09:02:27 np0005593294.novalocal sudo[6756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:27 np0005593294.novalocal python3[6758]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:02:27 np0005593294.novalocal sudo[6756]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:27 np0005593294.novalocal sudo[6829]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-booryzrbqkfamuvvxwqnocpytpczqqyq ; /usr/bin/python3'
Jan 23 09:02:27 np0005593294.novalocal sudo[6829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:28 np0005593294.novalocal python3[6831]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769158947.3521514-452-112811626565435/source _original_basename=tmpt0kuwa9_ follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:28 np0005593294.novalocal sudo[6829]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:28 np0005593294.novalocal sudo[6880]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sveiqymzsgizpzryhxgqiomwoeadvvuh ; /usr/bin/python3'
Jan 23 09:02:28 np0005593294.novalocal sudo[6880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:28 np0005593294.novalocal python3[6882]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-639e-86bd-00000000001f-1-compute1 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:02:28 np0005593294.novalocal sudo[6880]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:29 np0005593294.novalocal python3[6910]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-639e-86bd-000000000020-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Jan 23 09:02:30 np0005593294.novalocal python3[6939]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:48 np0005593294.novalocal sudo[6963]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-klgsdniorjnuiffaedfzyxsqysufzals ; /usr/bin/python3'
Jan 23 09:02:48 np0005593294.novalocal sudo[6963]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:02:48 np0005593294.novalocal python3[6965]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:02:48 np0005593294.novalocal sudo[6963]: pam_unix(sudo:session): session closed for user root
Jan 23 09:02:52 np0005593294.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 09:03:48 np0005593294.novalocal sshd-session[4334]: Received disconnect from 38.102.83.114 port 52204:11: disconnected by user
Jan 23 09:03:48 np0005593294.novalocal sshd-session[4334]: Disconnected from user zuul 38.102.83.114 port 52204
Jan 23 09:03:48 np0005593294.novalocal sshd-session[4321]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:03:48 np0005593294.novalocal systemd-logind[807]: Session 1 logged out. Waiting for processes to exit.
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Jan 23 09:03:54 np0005593294.novalocal kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Jan 23 09:03:54 np0005593294.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4082] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 09:03:54 np0005593294.novalocal systemd-udevd[6969]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4352] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4372] settings: (eth1): created default wired connection 'Wired connection 1'
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4374] device (eth1): carrier: link connected
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4376] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4380] policy: auto-activating connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1)
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4383] device (eth1): Activation: starting connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1)
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4384] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4386] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4389] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:03:54 np0005593294.novalocal NetworkManager[856]: <info>  [1769159034.4392] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:03:54 np0005593294.novalocal systemd[4325]: Starting Mark boot as successful...
Jan 23 09:03:54 np0005593294.novalocal systemd[4325]: Finished Mark boot as successful.
Jan 23 09:03:55 np0005593294.novalocal sshd-session[6973]: Accepted publickey for zuul from 38.102.83.114 port 43394 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:03:55 np0005593294.novalocal systemd-logind[807]: New session 3 of user zuul.
Jan 23 09:03:55 np0005593294.novalocal systemd[1]: Started Session 3 of User zuul.
Jan 23 09:03:55 np0005593294.novalocal sshd-session[6973]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:03:55 np0005593294.novalocal python3[7000]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-4543-3693-00000000018f-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:04:05 np0005593294.novalocal sudo[7078]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjowycxbuzcdqwpnkjsymzftiyilwbat ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:04:05 np0005593294.novalocal sudo[7078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:05 np0005593294.novalocal python3[7080]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:04:05 np0005593294.novalocal sudo[7078]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:05 np0005593294.novalocal sudo[7151]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfgdvuegckelwikrkddxkfqemzncmyi ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:04:05 np0005593294.novalocal sudo[7151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:05 np0005593294.novalocal python3[7153]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769159045.3941975-155-196895230060800/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=cf8ee7cd7bc1fd6d9388d3c03a8ac5811adcc451 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:04:06 np0005593294.novalocal sudo[7151]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:06 np0005593294.novalocal sudo[7201]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twqjgfgremkmrbbxmnlajotlbwlepmtf ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:04:06 np0005593294.novalocal sudo[7201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:04:06 np0005593294.novalocal python3[7203]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Stopped Network Manager Wait Online.
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Stopping Network Manager Wait Online...
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Stopping Network Manager...
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.5967] caught SIGTERM, shutting down normally.
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.5980] dhcp4 (eth0): canceled DHCP transaction
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.5981] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.5981] dhcp4 (eth0): state changed no lease
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.5983] manager: NetworkManager state is now CONNECTING
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.6115] dhcp4 (eth1): canceled DHCP transaction
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.6115] dhcp4 (eth1): state changed no lease
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[856]: <info>  [1769159046.6186] exiting (success)
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Stopped Network Manager.
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: NetworkManager.service: Consumed 1.209s CPU time, 10.2M memory peak.
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Starting Network Manager...
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.6775] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0ec3f185-e60c-43ea-a74e-c21caf2508ae)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.6776] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.6819] manager[0x55f86ffe1000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Starting Hostname Service...
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Started Hostname Service.
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.7940] hostname: hostname: using hostnamed
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.7941] hostname: static hostname changed from (none) to "np0005593294.novalocal"
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.7949] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.7957] manager[0x55f86ffe1000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.7958] manager[0x55f86ffe1000]: rfkill: WWAN hardware radio set enabled
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8007] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8008] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8009] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8010] manager: Networking is enabled by state file
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8015] settings: Loaded settings plugin: keyfile (internal)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8021] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8062] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8078] dhcp: init: Using DHCP client 'internal'
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8082] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8090] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8098] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8111] device (lo): Activation: starting connection 'lo' (6a1055b1-2674-4e8e-9fff-1fce9dcc1052)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8121] device (eth0): carrier: link connected
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8128] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8135] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8136] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8146] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8155] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8163] device (eth1): carrier: link connected
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8170] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8178] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1) (indicated)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8178] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8187] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8198] device (eth1): Activation: starting connection 'Wired connection 1' (c9ce933b-996c-3254-bccc-8d3373d274f1)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8206] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Started Network Manager.
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8213] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8221] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8224] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8228] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8233] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8237] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8242] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8247] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8260] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8266] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8277] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8281] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8312] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8320] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8327] device (lo): Activation: successful, device activated.
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8338] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8348] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 09:04:06 np0005593294.novalocal systemd[1]: Starting Network Manager Wait Online...
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8438] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8471] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8474] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8481] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8488] device (eth0): Activation: successful, device activated.
Jan 23 09:04:06 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159046.8499] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 09:04:06 np0005593294.novalocal sudo[7201]: pam_unix(sudo:session): session closed for user root
Jan 23 09:04:07 np0005593294.novalocal python3[7288]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-4543-3693-0000000000c8-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:04:16 np0005593294.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:04:36 np0005593294.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5256] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:04:52 np0005593294.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:04:52 np0005593294.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5587] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5591] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5600] device (eth1): Activation: successful, device activated.
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5608] manager: startup complete
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5610] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <warn>  [1769159092.5619] device (eth1): Activation: failed for connection 'Wired connection 1'
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5627] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal systemd[1]: Finished Network Manager Wait Online.
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5805] dhcp4 (eth1): canceled DHCP transaction
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5806] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5806] dhcp4 (eth1): state changed no lease
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5822] policy: auto-activating connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5826] device (eth1): Activation: starting connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5827] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5831] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5838] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5849] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5893] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5896] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:04:52 np0005593294.novalocal NetworkManager[7216]: <info>  [1769159092.5903] device (eth1): Activation: successful, device activated.
Jan 23 09:05:02 np0005593294.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:05:07 np0005593294.novalocal sshd-session[6976]: Received disconnect from 38.102.83.114 port 43394:11: disconnected by user
Jan 23 09:05:07 np0005593294.novalocal sshd-session[6976]: Disconnected from user zuul 38.102.83.114 port 43394
Jan 23 09:05:07 np0005593294.novalocal sshd-session[6973]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:05:07 np0005593294.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Jan 23 09:05:07 np0005593294.novalocal systemd[1]: session-3.scope: Consumed 1.462s CPU time.
Jan 23 09:05:07 np0005593294.novalocal systemd-logind[807]: Session 3 logged out. Waiting for processes to exit.
Jan 23 09:05:07 np0005593294.novalocal systemd-logind[807]: Removed session 3.
Jan 23 09:05:53 np0005593294.novalocal sshd-session[7316]: Accepted publickey for zuul from 38.102.83.114 port 55900 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:05:53 np0005593294.novalocal systemd-logind[807]: New session 4 of user zuul.
Jan 23 09:05:53 np0005593294.novalocal systemd[1]: Started Session 4 of User zuul.
Jan 23 09:05:53 np0005593294.novalocal sshd-session[7316]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:05:53 np0005593294.novalocal sudo[7395]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgoyxfqfprruvfujtqwxljlxmqkodvtn ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:05:53 np0005593294.novalocal sudo[7395]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:53 np0005593294.novalocal python3[7397]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:05:53 np0005593294.novalocal sudo[7395]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:54 np0005593294.novalocal sudo[7468]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-juhkbqglvvnbjxknxllwimrenxzeskui ; OS_CLOUD=vexxhost /usr/bin/python3'
Jan 23 09:05:54 np0005593294.novalocal sudo[7468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:05:54 np0005593294.novalocal python3[7470]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159153.5164309-373-212687453147454/source _original_basename=tmp62uv7bel follow=False checksum=6e1e8970cf6ad2f0b1a32d462d71e8a0528ec2d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:05:54 np0005593294.novalocal sudo[7468]: pam_unix(sudo:session): session closed for user root
Jan 23 09:05:57 np0005593294.novalocal sshd-session[7319]: Connection closed by 38.102.83.114 port 55900
Jan 23 09:05:57 np0005593294.novalocal sshd-session[7316]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:05:57 np0005593294.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Jan 23 09:05:57 np0005593294.novalocal systemd-logind[807]: Session 4 logged out. Waiting for processes to exit.
Jan 23 09:05:57 np0005593294.novalocal systemd-logind[807]: Removed session 4.
Jan 23 09:06:57 np0005593294.novalocal systemd[4325]: Created slice User Background Tasks Slice.
Jan 23 09:06:57 np0005593294.novalocal systemd[4325]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 09:06:57 np0005593294.novalocal systemd[4325]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 09:14:49 np0005593294.novalocal sshd-session[7502]: Accepted publickey for zuul from 38.102.83.114 port 39136 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:14:49 np0005593294.novalocal systemd-logind[807]: New session 5 of user zuul.
Jan 23 09:14:49 np0005593294.novalocal systemd[1]: Started Session 5 of User zuul.
Jan 23 09:14:49 np0005593294.novalocal sshd-session[7502]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:14:49 np0005593294.novalocal sudo[7529]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yccqvugovzjwozjojnacuremltqqzzzq ; /usr/bin/python3'
Jan 23 09:14:49 np0005593294.novalocal sudo[7529]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:50 np0005593294.novalocal python3[7531]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-00000000217f-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:50 np0005593294.novalocal sudo[7529]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:50 np0005593294.novalocal sudo[7558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bdgmovxuesitygbdilugribhqsjyfxvy ; /usr/bin/python3'
Jan 23 09:14:50 np0005593294.novalocal sudo[7558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:50 np0005593294.novalocal python3[7560]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:50 np0005593294.novalocal sudo[7558]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:50 np0005593294.novalocal sudo[7584]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcplxlzsfmzgyguamhvitlygqbladilv ; /usr/bin/python3'
Jan 23 09:14:50 np0005593294.novalocal sudo[7584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:50 np0005593294.novalocal python3[7586]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:50 np0005593294.novalocal sudo[7584]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:50 np0005593294.novalocal sudo[7610]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nqyeopdnvvdbadssqyvbbpjfehxdxjge ; /usr/bin/python3'
Jan 23 09:14:50 np0005593294.novalocal sudo[7610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:51 np0005593294.novalocal python3[7612]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:51 np0005593294.novalocal sudo[7610]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:51 np0005593294.novalocal sudo[7636]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afjqkinyjvcbavdwopujbvxvxklrfxqw ; /usr/bin/python3'
Jan 23 09:14:51 np0005593294.novalocal sudo[7636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:51 np0005593294.novalocal python3[7638]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:51 np0005593294.novalocal sudo[7636]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:51 np0005593294.novalocal sudo[7662]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plplunrwqsmouqixjbhwcsexwkskjgiw ; /usr/bin/python3'
Jan 23 09:14:51 np0005593294.novalocal sudo[7662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:52 np0005593294.novalocal python3[7664]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:52 np0005593294.novalocal sudo[7662]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:52 np0005593294.novalocal sudo[7740]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ttubnpbyhsqreutdlhzzadjuzrztxjvs ; /usr/bin/python3'
Jan 23 09:14:52 np0005593294.novalocal sudo[7740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:52 np0005593294.novalocal python3[7742]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:14:52 np0005593294.novalocal sudo[7740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:52 np0005593294.novalocal sudo[7813]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jsaagguyiffyhiorcrbdsljrhrshiytb ; /usr/bin/python3'
Jan 23 09:14:52 np0005593294.novalocal sudo[7813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:53 np0005593294.novalocal python3[7815]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159692.4885798-545-101184356020038/source _original_basename=tmp9yq7mp4q follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:14:53 np0005593294.novalocal sudo[7813]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:54 np0005593294.novalocal sudo[7863]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjuaxbynjkzjvjaajmbvsupfiohqrxis ; /usr/bin/python3'
Jan 23 09:14:54 np0005593294.novalocal sudo[7863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:54 np0005593294.novalocal python3[7865]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 09:14:54 np0005593294.novalocal systemd[1]: Reloading.
Jan 23 09:14:54 np0005593294.novalocal systemd-rc-local-generator[7887]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:14:54 np0005593294.novalocal sudo[7863]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:56 np0005593294.novalocal sudo[7919]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqqefkmtynztckchmsqizyxbwhtuodsd ; /usr/bin/python3'
Jan 23 09:14:56 np0005593294.novalocal sudo[7919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:56 np0005593294.novalocal python3[7921]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Jan 23 09:14:56 np0005593294.novalocal sudo[7919]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:56 np0005593294.novalocal sudo[7945]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oarlxjyiwpqwbfogzgisjzwahezzfcos ; /usr/bin/python3'
Jan 23 09:14:56 np0005593294.novalocal sudo[7945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:56 np0005593294.novalocal python3[7947]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:57 np0005593294.novalocal sudo[7945]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:57 np0005593294.novalocal sudo[7973]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fbscnsafqsuqfeyxqljulynyytpwzlrz ; /usr/bin/python3'
Jan 23 09:14:57 np0005593294.novalocal sudo[7973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:57 np0005593294.novalocal python3[7975]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:57 np0005593294.novalocal sudo[7973]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:57 np0005593294.novalocal sudo[8001]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-przeayqdaditebuivezoiceybrzftdug ; /usr/bin/python3'
Jan 23 09:14:57 np0005593294.novalocal sudo[8001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:57 np0005593294.novalocal python3[8003]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:57 np0005593294.novalocal sudo[8001]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:57 np0005593294.novalocal sudo[8029]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zuzpdzoxwuvxtzcbfzhfzmnqnchlahto ; /usr/bin/python3'
Jan 23 09:14:57 np0005593294.novalocal sudo[8029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:14:57 np0005593294.novalocal python3[8031]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:57 np0005593294.novalocal sudo[8029]: pam_unix(sudo:session): session closed for user root
Jan 23 09:14:58 np0005593294.novalocal python3[8058]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-5353-1fb2-000000002186-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:14:59 np0005593294.novalocal python3[8088]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 09:15:03 np0005593294.novalocal sshd-session[7505]: Connection closed by 38.102.83.114 port 39136
Jan 23 09:15:03 np0005593294.novalocal sshd-session[7502]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:15:03 np0005593294.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Jan 23 09:15:03 np0005593294.novalocal systemd[1]: session-5.scope: Consumed 3.835s CPU time.
Jan 23 09:15:03 np0005593294.novalocal systemd-logind[807]: Session 5 logged out. Waiting for processes to exit.
Jan 23 09:15:03 np0005593294.novalocal systemd-logind[807]: Removed session 5.
Jan 23 09:15:05 np0005593294.novalocal sshd-session[8095]: Accepted publickey for zuul from 38.102.83.114 port 54358 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:15:05 np0005593294.novalocal systemd-logind[807]: New session 6 of user zuul.
Jan 23 09:15:05 np0005593294.novalocal systemd[1]: Started Session 6 of User zuul.
Jan 23 09:15:05 np0005593294.novalocal sshd-session[8095]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:15:05 np0005593294.novalocal sudo[8122]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghyjafbpybmwppdfvsqyuyacyyiqupzs ; /usr/bin/python3'
Jan 23 09:15:05 np0005593294.novalocal sudo[8122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:15:05 np0005593294.novalocal python3[8124]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 09:15:13 np0005593294.novalocal setsebool[8167]: The virt_use_nfs policy boolean was changed to 1 by root
Jan 23 09:15:13 np0005593294.novalocal setsebool[8167]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Jan 23 09:15:13 np0005593294.novalocal irqbalance[791]: Cannot change IRQ 27 affinity: Operation not permitted
Jan 23 09:15:13 np0005593294.novalocal irqbalance[791]: IRQ 27 affinity is now unmanaged
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  Converting 386 SID table entries...
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:15:29 np0005593294.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  Converting 389 SID table entries...
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  policy capability open_perms=1
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:15:42 np0005593294.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:15:57 np0005593294.novalocal dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 09:15:57 np0005593294.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Jan 23 09:15:57 np0005593294.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Jan 23 09:15:57 np0005593294.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Jan 23 09:15:57 np0005593294.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Jan 23 09:16:02 np0005593294.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:16:02 np0005593294.novalocal systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:16:02 np0005593294.novalocal systemd[1]: Reloading.
Jan 23 09:16:02 np0005593294.novalocal systemd-rc-local-generator[8934]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:16:02 np0005593294.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:16:06 np0005593294.novalocal sudo[8122]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:15 np0005593294.novalocal python3[16348]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-f136-f057-00000000000c-1-compute1 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:16:18 np0005593294.novalocal kernel: evm: overlay not supported
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: Starting D-Bus User Message Bus...
Jan 23 09:16:20 np0005593294.novalocal dbus-broker-launch[17548]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Jan 23 09:16:20 np0005593294.novalocal dbus-broker-launch[17548]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: Started D-Bus User Message Bus.
Jan 23 09:16:20 np0005593294.novalocal dbus-broker-lau[17548]: Ready
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: selinux: avc:  op=load_policy lsm=selinux seqno=4 res=1
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: Created slice Slice /user.
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: podman-16860.scope: unit configures an IP firewall, but not running as root.
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: (This warning is only shown for the first unit using IP firewalling.)
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: Started podman-16860.scope.
Jan 23 09:16:20 np0005593294.novalocal systemd[4325]: Started podman-pause-3e59ae14.scope.
Jan 23 09:16:20 np0005593294.novalocal sshd-session[17730]: Connection closed by 45.148.10.240 port 36264
Jan 23 09:16:20 np0005593294.novalocal sshd-session[8098]: Connection closed by 38.102.83.114 port 54358
Jan 23 09:16:20 np0005593294.novalocal sshd-session[8095]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:16:20 np0005593294.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Jan 23 09:16:20 np0005593294.novalocal systemd[1]: session-6.scope: Consumed 49.240s CPU time.
Jan 23 09:16:20 np0005593294.novalocal systemd-logind[807]: Session 6 logged out. Waiting for processes to exit.
Jan 23 09:16:20 np0005593294.novalocal systemd-logind[807]: Removed session 6.
Jan 23 09:16:36 np0005593294.novalocal sshd-session[23927]: Connection closed by 38.129.56.17 port 38358 [preauth]
Jan 23 09:16:36 np0005593294.novalocal sshd-session[23933]: Connection closed by 38.129.56.17 port 38364 [preauth]
Jan 23 09:16:36 np0005593294.novalocal sshd-session[23929]: Unable to negotiate with 38.129.56.17 port 38378: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Jan 23 09:16:36 np0005593294.novalocal sshd-session[23930]: Unable to negotiate with 38.129.56.17 port 38388: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Jan 23 09:16:36 np0005593294.novalocal sshd-session[23935]: Unable to negotiate with 38.129.56.17 port 38390: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Jan 23 09:16:41 np0005593294.novalocal sshd-session[25907]: Accepted publickey for zuul from 38.102.83.114 port 57496 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:16:41 np0005593294.novalocal systemd-logind[807]: New session 7 of user zuul.
Jan 23 09:16:41 np0005593294.novalocal systemd[1]: Started Session 7 of User zuul.
Jan 23 09:16:41 np0005593294.novalocal sshd-session[25907]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:16:42 np0005593294.novalocal python3[26035]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:16:42 np0005593294.novalocal sudo[26237]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eteqwhdbsenmjkjyqyrqjseeunlusjto ; /usr/bin/python3'
Jan 23 09:16:42 np0005593294.novalocal sudo[26237]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:42 np0005593294.novalocal python3[26249]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:16:42 np0005593294.novalocal sudo[26237]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:43 np0005593294.novalocal sudo[26602]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdrcrwrumsqtbavrbotfibemzznihlpt ; /usr/bin/python3'
Jan 23 09:16:43 np0005593294.novalocal sudo[26602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:43 np0005593294.novalocal python3[26612]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005593294.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Jan 23 09:16:43 np0005593294.novalocal useradd[26681]: new group: name=cloud-admin, GID=1002
Jan 23 09:16:43 np0005593294.novalocal useradd[26681]: new user: name=cloud-admin, UID=1002, GID=1002, home=/home/cloud-admin, shell=/bin/bash, from=none
Jan 23 09:16:44 np0005593294.novalocal sudo[26602]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:44 np0005593294.novalocal sudo[27146]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxiqvmbmwrqcjohpgfqyhvrkvfcsvteu ; /usr/bin/python3'
Jan 23 09:16:44 np0005593294.novalocal sudo[27146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:44 np0005593294.novalocal python3[27156]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBIXU6aMT27gF+Yfs/YZWwo3YepWSGuQLHNXTSuo3za5wTzqiDdK4Z0aI/Vfz5yHXRMPrH9UNJkm8FGQwkK4yHMQ= zuul@np0005593292.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Jan 23 09:16:44 np0005593294.novalocal sudo[27146]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:45 np0005593294.novalocal sudo[27272]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjkwkmiyuylqfeigbqtofvbnecnwqplv ; /usr/bin/python3'
Jan 23 09:16:45 np0005593294.novalocal sudo[27272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:45 np0005593294.novalocal python3[27274]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:16:45 np0005593294.novalocal sudo[27272]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:45 np0005593294.novalocal sudo[27433]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxwwbmutbhgviiyzstlbtkrajhpkcfqx ; /usr/bin/python3'
Jan 23 09:16:45 np0005593294.novalocal sudo[27433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:45 np0005593294.novalocal python3[27446]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769159804.7710958-151-62106867819167/source _original_basename=tmp1c5_jq48 follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:16:45 np0005593294.novalocal sudo[27433]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:46 np0005593294.novalocal sudo[27757]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghzdpjzuxizxrzjlzhuyhrtqqfttscbg ; /usr/bin/python3'
Jan 23 09:16:46 np0005593294.novalocal sudo[27757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:16:46 np0005593294.novalocal python3[27766]: ansible-ansible.builtin.hostname Invoked with name=compute-1 use=systemd
Jan 23 09:16:46 np0005593294.novalocal systemd[1]: Starting Hostname Service...
Jan 23 09:16:46 np0005593294.novalocal systemd[1]: Started Hostname Service.
Jan 23 09:16:46 np0005593294.novalocal systemd-hostnamed[27875]: Changed pretty hostname to 'compute-1'
Jan 23 09:16:46 compute-1 systemd-hostnamed[27875]: Hostname set to <compute-1> (static)
Jan 23 09:16:46 compute-1 NetworkManager[7216]: <info>  [1769159806.6029] hostname: static hostname changed from "np0005593294.novalocal" to "compute-1"
Jan 23 09:16:46 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:16:46 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:16:46 compute-1 sudo[27757]: pam_unix(sudo:session): session closed for user root
Jan 23 09:16:47 compute-1 sshd-session[25975]: Connection closed by 38.102.83.114 port 57496
Jan 23 09:16:47 compute-1 sshd-session[25907]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:16:47 compute-1 systemd[1]: session-7.scope: Deactivated successfully.
Jan 23 09:16:47 compute-1 systemd[1]: session-7.scope: Consumed 2.476s CPU time.
Jan 23 09:16:47 compute-1 systemd-logind[807]: Session 7 logged out. Waiting for processes to exit.
Jan 23 09:16:47 compute-1 systemd-logind[807]: Removed session 7.
Jan 23 09:16:56 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:16:56 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:16:56 compute-1 systemd[1]: man-db-cache-update.service: Consumed 53.471s CPU time.
Jan 23 09:16:56 compute-1 systemd[1]: run-ra6e1f36ccf3f47abb7de2cd3ca88c949.service: Deactivated successfully.
Jan 23 09:16:56 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:17:16 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:18:58 compute-1 sshd-session[29935]: Invalid user sol from 45.148.10.240 port 56588
Jan 23 09:18:58 compute-1 sshd-session[29935]: Connection closed by invalid user sol 45.148.10.240 port 56588 [preauth]
Jan 23 09:21:20 compute-1 sshd-session[29938]: Invalid user solana from 45.148.10.240 port 57376
Jan 23 09:21:20 compute-1 sshd-session[29938]: Connection closed by invalid user solana 45.148.10.240 port 57376 [preauth]
Jan 23 09:21:34 compute-1 sshd-session[29940]: Invalid user  from 194.187.176.203 port 33266
Jan 23 09:21:35 compute-1 sshd-session[29940]: Connection closed by invalid user  194.187.176.203 port 33266 [preauth]
Jan 23 09:21:46 compute-1 sshd-session[29942]: Accepted publickey for zuul from 38.129.56.17 port 33294 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:21:46 compute-1 systemd-logind[807]: New session 8 of user zuul.
Jan 23 09:21:46 compute-1 systemd[1]: Started Session 8 of User zuul.
Jan 23 09:21:46 compute-1 sshd-session[29942]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:21:47 compute-1 python3[30018]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:21:48 compute-1 sudo[30132]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezewroidstibocjlfgpphrqhtrviamdo ; /usr/bin/python3'
Jan 23 09:21:48 compute-1 sudo[30132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:48 compute-1 python3[30134]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:49 compute-1 sudo[30132]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:49 compute-1 sudo[30205]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsqzqrrhfrdembtapjejhvvinfggzfef ; /usr/bin/python3'
Jan 23 09:21:49 compute-1 sudo[30205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:49 compute-1 python3[30207]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:49 compute-1 sudo[30205]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:49 compute-1 sudo[30231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvheddgnopnxfmkgpmltxpfiqjgfbped ; /usr/bin/python3'
Jan 23 09:21:49 compute-1 sudo[30231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:49 compute-1 python3[30233]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:49 compute-1 sudo[30231]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:49 compute-1 sudo[30304]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wfiidyuzfjtrracvpkdobmjqwxiptymw ; /usr/bin/python3'
Jan 23 09:21:49 compute-1 sudo[30304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-1 python3[30306]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:50 compute-1 sudo[30304]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:50 compute-1 sudo[30330]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xibveetogckpdpwwrxaaxadahknfulyk ; /usr/bin/python3'
Jan 23 09:21:50 compute-1 sudo[30330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-1 python3[30332]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:50 compute-1 sudo[30330]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:50 compute-1 sudo[30403]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itownqetlprvbthuiqnniyzvjmssgexr ; /usr/bin/python3'
Jan 23 09:21:50 compute-1 sudo[30403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-1 python3[30405]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:50 compute-1 sudo[30403]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:50 compute-1 sudo[30429]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbyvjqmtexrjkoxwkcricttexdbmbtme ; /usr/bin/python3'
Jan 23 09:21:50 compute-1 sudo[30429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:50 compute-1 python3[30431]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:51 compute-1 sudo[30429]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:51 compute-1 sudo[30502]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mgntqatdmhgscpkcmxvijquvhhmonzrn ; /usr/bin/python3'
Jan 23 09:21:51 compute-1 sudo[30502]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:51 compute-1 python3[30504]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:51 compute-1 sudo[30502]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:51 compute-1 sudo[30528]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcxhdzcoygibbltriqexvwwyyjvtlyjh ; /usr/bin/python3'
Jan 23 09:21:51 compute-1 sudo[30528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:51 compute-1 python3[30530]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:51 compute-1 sudo[30528]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:51 compute-1 sudo[30601]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-digapmoxxnkoxeaafzpgzitricaisamf ; /usr/bin/python3'
Jan 23 09:21:51 compute-1 sudo[30601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:51 compute-1 python3[30603]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:51 compute-1 sudo[30601]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:52 compute-1 sudo[30627]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qepzvhvpymaiikjybzmknxlpkdgrdssi ; /usr/bin/python3'
Jan 23 09:21:52 compute-1 sudo[30627]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:52 compute-1 python3[30629]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:52 compute-1 sudo[30627]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:52 compute-1 sudo[30700]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-leqtrbpmlovbqnxpctpyfudemolirddh ; /usr/bin/python3'
Jan 23 09:21:52 compute-1 sudo[30700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:52 compute-1 python3[30702]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:52 compute-1 sudo[30700]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:52 compute-1 sudo[30726]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bcrhdxokiozadekzlaqpzyyvfwaxjnek ; /usr/bin/python3'
Jan 23 09:21:52 compute-1 sudo[30726]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:52 compute-1 python3[30728]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:21:52 compute-1 sudo[30726]: pam_unix(sudo:session): session closed for user root
Jan 23 09:21:52 compute-1 sudo[30799]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfpxiuhyandgbirdnpizwfjugklvxuig ; /usr/bin/python3'
Jan 23 09:21:52 compute-1 sudo[30799]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:21:53 compute-1 python3[30801]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1769160108.701896-34064-74879226716954/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:21:53 compute-1 sudo[30799]: pam_unix(sudo:session): session closed for user root
Jan 23 09:22:05 compute-1 python3[30850]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:23:40 compute-1 sshd-session[30853]: Invalid user sol from 45.148.10.240 port 49762
Jan 23 09:23:40 compute-1 sshd-session[30853]: Connection closed by invalid user sol 45.148.10.240 port 49762 [preauth]
Jan 23 09:25:56 compute-1 sshd-session[30856]: Invalid user ubuntu from 45.148.10.240 port 39118
Jan 23 09:25:56 compute-1 sshd-session[30856]: Connection closed by invalid user ubuntu 45.148.10.240 port 39118 [preauth]
Jan 23 09:27:05 compute-1 sshd-session[29945]: Received disconnect from 38.129.56.17 port 33294:11: disconnected by user
Jan 23 09:27:05 compute-1 sshd-session[29945]: Disconnected from user zuul 38.129.56.17 port 33294
Jan 23 09:27:05 compute-1 sshd-session[29942]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:27:05 compute-1 systemd[1]: session-8.scope: Deactivated successfully.
Jan 23 09:27:05 compute-1 systemd[1]: session-8.scope: Consumed 4.809s CPU time.
Jan 23 09:27:05 compute-1 systemd-logind[807]: Session 8 logged out. Waiting for processes to exit.
Jan 23 09:27:05 compute-1 systemd-logind[807]: Removed session 8.
Jan 23 09:28:10 compute-1 sshd-session[30861]: Invalid user ubuntu from 45.148.10.240 port 39544
Jan 23 09:28:10 compute-1 sshd-session[30861]: Connection closed by invalid user ubuntu 45.148.10.240 port 39544 [preauth]
Jan 23 09:30:23 compute-1 sshd-session[30863]: Invalid user sol from 45.148.10.240 port 42978
Jan 23 09:30:23 compute-1 sshd-session[30863]: Connection closed by invalid user sol 45.148.10.240 port 42978 [preauth]
Jan 23 09:32:37 compute-1 sshd-session[30865]: Invalid user solana from 45.148.10.240 port 55550
Jan 23 09:32:37 compute-1 sshd-session[30865]: Connection closed by invalid user solana 45.148.10.240 port 55550 [preauth]
Jan 23 09:34:53 compute-1 sshd-session[30869]: Invalid user solana from 45.148.10.240 port 34760
Jan 23 09:34:53 compute-1 sshd-session[30869]: Connection closed by invalid user solana 45.148.10.240 port 34760 [preauth]
Jan 23 09:36:57 compute-1 systemd[1]: Starting dnf makecache...
Jan 23 09:36:57 compute-1 dnf[30871]: Failed determining last makecache time.
Jan 23 09:36:57 compute-1 dnf[30871]: delorean-openstack-barbican-42b4c41831408a8e323 375 kB/s |  13 kB     00:00
Jan 23 09:36:57 compute-1 dnf[30871]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 2.4 MB/s |  65 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.3 MB/s |  32 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-python-stevedore-c4acc5639fd2329372142 4.7 MB/s | 131 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.2 MB/s |  32 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-os-refresh-config-9bfc52b5049be2d8de61  11 MB/s | 349 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 386 kB/s |  42 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-python-designate-tests-tempest-347fdbc 551 kB/s |  18 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-glance-1fd12c29b339f30fe823e 517 kB/s |  18 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.1 MB/s |  29 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-manila-3c01b7181572c95dac462 1.0 MB/s |  25 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-python-whitebox-neutron-tests-tempest- 5.0 MB/s | 154 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-octavia-ba397f07a7331190208c 975 kB/s |  26 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-watcher-c014f81a8647287f6dcc 641 kB/s |  16 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-ansible-config_template-5ccaa22121a7ff 310 kB/s | 7.4 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 4.3 MB/s | 144 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-swift-dc98a8463506ac520c469a 541 kB/s |  14 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-python-tempestconf-8515371b7cceebd4282 2.2 MB/s |  53 kB     00:00
Jan 23 09:36:58 compute-1 dnf[30871]: delorean-openstack-heat-ui-013accbfd179753bc3f0 2.8 MB/s |  96 kB     00:00
Jan 23 09:36:59 compute-1 dnf[30871]: CentOS Stream 9 - BaseOS                         66 kB/s | 6.7 kB     00:00
Jan 23 09:36:59 compute-1 dnf[30871]: CentOS Stream 9 - AppStream                      67 kB/s | 6.8 kB     00:00
Jan 23 09:36:59 compute-1 dnf[30871]: CentOS Stream 9 - CRB                            55 kB/s | 6.6 kB     00:00
Jan 23 09:36:59 compute-1 dnf[30871]: CentOS Stream 9 - Extras packages                69 kB/s | 7.3 kB     00:00
Jan 23 09:36:59 compute-1 dnf[30871]: dlrn-antelope-testing                            26 MB/s | 1.1 MB     00:00
Jan 23 09:37:00 compute-1 dnf[30871]: dlrn-antelope-build-deps                         10 MB/s | 461 kB     00:00
Jan 23 09:37:00 compute-1 dnf[30871]: centos9-rabbitmq                                7.5 MB/s | 123 kB     00:00
Jan 23 09:37:00 compute-1 dnf[30871]: centos9-storage                                  20 MB/s | 415 kB     00:00
Jan 23 09:37:00 compute-1 dnf[30871]: centos9-opstools                                3.7 MB/s |  51 kB     00:00
Jan 23 09:37:00 compute-1 dnf[30871]: NFV SIG OpenvSwitch                              19 MB/s | 461 kB     00:00
Jan 23 09:37:01 compute-1 dnf[30871]: repo-setup-centos-appstream                      89 MB/s |  26 MB     00:00
Jan 23 09:37:07 compute-1 dnf[30871]: repo-setup-centos-baseos                         62 MB/s | 8.9 MB     00:00
Jan 23 09:37:09 compute-1 dnf[30871]: repo-setup-centos-highavailability               29 MB/s | 744 kB     00:00
Jan 23 09:37:09 compute-1 dnf[30871]: repo-setup-centos-powertools                     64 MB/s | 7.6 MB     00:00
Jan 23 09:37:11 compute-1 sshd-session[30972]: Invalid user sol from 45.148.10.240 port 35152
Jan 23 09:37:12 compute-1 sshd-session[30972]: Connection closed by invalid user sol 45.148.10.240 port 35152 [preauth]
Jan 23 09:37:12 compute-1 dnf[30871]: Extra Packages for Enterprise Linux 9 - x86_64   18 MB/s |  20 MB     00:01
Jan 23 09:37:13 compute-1 sshd-session[30974]: Accepted publickey for zuul from 192.168.122.30 port 48970 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:37:13 compute-1 systemd-logind[807]: New session 9 of user zuul.
Jan 23 09:37:14 compute-1 systemd[1]: Started Session 9 of User zuul.
Jan 23 09:37:14 compute-1 sshd-session[30974]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:37:15 compute-1 python3.9[31127]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:37:16 compute-1 sudo[31306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwpoeodwrwheccswcqcmfdlltqvwmcet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161035.722668-52-13777070372264/AnsiballZ_command.py'
Jan 23 09:37:16 compute-1 sudo[31306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:16 compute-1 python3.9[31308]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:37:23 compute-1 sudo[31306]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:26 compute-1 dnf[30871]: Metadata cache created.
Jan 23 09:37:26 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 09:37:26 compute-1 systemd[1]: Finished dnf makecache.
Jan 23 09:37:26 compute-1 systemd[1]: dnf-makecache.service: Consumed 26.725s CPU time.
Jan 23 09:37:29 compute-1 sshd-session[30977]: Connection closed by 192.168.122.30 port 48970
Jan 23 09:37:29 compute-1 sshd-session[30974]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:37:29 compute-1 systemd[1]: session-9.scope: Deactivated successfully.
Jan 23 09:37:29 compute-1 systemd[1]: session-9.scope: Consumed 8.179s CPU time.
Jan 23 09:37:29 compute-1 systemd-logind[807]: Session 9 logged out. Waiting for processes to exit.
Jan 23 09:37:29 compute-1 systemd-logind[807]: Removed session 9.
Jan 23 09:37:46 compute-1 sshd-session[31368]: Accepted publickey for zuul from 192.168.122.30 port 60956 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:37:46 compute-1 systemd-logind[807]: New session 10 of user zuul.
Jan 23 09:37:46 compute-1 systemd[1]: Started Session 10 of User zuul.
Jan 23 09:37:46 compute-1 sshd-session[31368]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:37:47 compute-1 python3.9[31521]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 09:37:49 compute-1 python3.9[31695]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:37:49 compute-1 sudo[31845]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hxgrnizjlbgyumkymtowonrpnnatliyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161069.3342621-89-249600866529981/AnsiballZ_command.py'
Jan 23 09:37:49 compute-1 sudo[31845]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:49 compute-1 python3.9[31847]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:37:50 compute-1 sudo[31845]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:51 compute-1 sudo[31998]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfcerwydrnliwnasiangvotdjqdmnnse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161070.4192908-125-127380700806155/AnsiballZ_stat.py'
Jan 23 09:37:51 compute-1 sudo[31998]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:51 compute-1 python3.9[32000]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:37:51 compute-1 sudo[31998]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:52 compute-1 sudo[32150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-stkgkjxcefwrxqycuxaccllmsupfimut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161071.5588615-149-2333646537444/AnsiballZ_file.py'
Jan 23 09:37:52 compute-1 sudo[32150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:52 compute-1 python3.9[32152]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:37:52 compute-1 sudo[32150]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:52 compute-1 sudo[32302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vmyjujbqlceacvgtjqdmbxxdhfrovomt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161072.4290197-173-99330198734100/AnsiballZ_stat.py'
Jan 23 09:37:52 compute-1 sudo[32302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:52 compute-1 python3.9[32304]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:37:52 compute-1 sudo[32302]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:53 compute-1 sudo[32425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uckngjwoqrwzinvqkwwqubuhrejhhmye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161072.4290197-173-99330198734100/AnsiballZ_copy.py'
Jan 23 09:37:53 compute-1 sudo[32425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:53 compute-1 python3.9[32427]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161072.4290197-173-99330198734100/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:37:53 compute-1 sudo[32425]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:54 compute-1 sudo[32577]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xrfhoqtoptbxjsggqfoqfyvtfpleagqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161074.1414878-218-210414166788736/AnsiballZ_setup.py'
Jan 23 09:37:54 compute-1 sudo[32577]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:54 compute-1 python3.9[32579]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:37:54 compute-1 sudo[32577]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:55 compute-1 sudo[32733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbqlkjpdyietzfncnkiedrmzzuoevmie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161075.147616-242-105325963757906/AnsiballZ_file.py'
Jan 23 09:37:55 compute-1 sudo[32733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:55 compute-1 python3.9[32735]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:37:55 compute-1 sudo[32733]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:56 compute-1 sudo[32885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-itvrmfhpulsuyddgfwtxnfadnjpnnats ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161075.9031289-269-266900162993091/AnsiballZ_file.py'
Jan 23 09:37:56 compute-1 sudo[32885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:37:56 compute-1 python3.9[32887]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:37:56 compute-1 sudo[32885]: pam_unix(sudo:session): session closed for user root
Jan 23 09:37:57 compute-1 python3.9[33037]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:38:00 compute-1 python3.9[33290]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:38:01 compute-1 python3.9[33440]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:38:03 compute-1 python3.9[33594]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:38:04 compute-1 sudo[33750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmiuzwvmfmgnrafvbfhdnzouwcxirfeh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161083.7465644-414-171987134831860/AnsiballZ_setup.py'
Jan 23 09:38:04 compute-1 sudo[33750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:38:04 compute-1 python3.9[33752]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:38:04 compute-1 sudo[33750]: pam_unix(sudo:session): session closed for user root
Jan 23 09:38:05 compute-1 sudo[33834]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlkskdrzxzthjaxtullpglylcudtccsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161083.7465644-414-171987134831860/AnsiballZ_dnf.py'
Jan 23 09:38:05 compute-1 sudo[33834]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:38:05 compute-1 python3.9[33836]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:38:38 compute-1 systemd[1]: Reloading.
Jan 23 09:38:38 compute-1 systemd-rc-local-generator[34036]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:38:38 compute-1 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Jan 23 09:38:38 compute-1 systemd[1]: Reloading.
Jan 23 09:38:38 compute-1 systemd-rc-local-generator[34079]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:38:39 compute-1 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Jan 23 09:38:39 compute-1 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Jan 23 09:38:39 compute-1 systemd[1]: Reloading.
Jan 23 09:38:39 compute-1 systemd-rc-local-generator[34118]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:38:39 compute-1 systemd[1]: Listening on LVM2 poll daemon socket.
Jan 23 09:38:39 compute-1 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 09:38:39 compute-1 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 09:38:39 compute-1 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 09:39:29 compute-1 sshd-session[34281]: Invalid user sol from 45.148.10.240 port 53708
Jan 23 09:39:29 compute-1 sshd-session[34281]: Connection closed by invalid user sol 45.148.10.240 port 53708 [preauth]
Jan 23 09:39:48 compute-1 kernel: SELinux:  Converting 2724 SID table entries...
Jan 23 09:39:48 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:39:48 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:39:48 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:39:48 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:39:48 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:39:48 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:39:48 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:39:49 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Jan 23 09:39:49 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:39:49 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:39:49 compute-1 systemd[1]: Reloading.
Jan 23 09:39:49 compute-1 systemd-rc-local-generator[34435]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:39:49 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:39:50 compute-1 sudo[33834]: pam_unix(sudo:session): session closed for user root
Jan 23 09:39:51 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:39:51 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:39:51 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.462s CPU time.
Jan 23 09:39:51 compute-1 systemd[1]: run-r9f3775315b3e4c6b987812721a80dfba.service: Deactivated successfully.
Jan 23 09:40:32 compute-1 sudo[35345]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mdgebwslkgliecprahqhmivrtymnypjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161231.6893804-450-128951400301027/AnsiballZ_command.py'
Jan 23 09:40:32 compute-1 sudo[35345]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:32 compute-1 python3.9[35347]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:40:33 compute-1 sudo[35345]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:34 compute-1 sudo[35626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgvswmaqrolzseiljxrdzmlfxcxfdjyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161233.470227-473-177142819974825/AnsiballZ_selinux.py'
Jan 23 09:40:34 compute-1 sudo[35626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:34 compute-1 python3.9[35628]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 09:40:34 compute-1 sudo[35626]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:35 compute-1 sudo[35778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-feeqmvjrcxvsxcdkzavsriidkbiyyulo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161235.2777781-506-210540760205816/AnsiballZ_command.py'
Jan 23 09:40:35 compute-1 sudo[35778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:35 compute-1 python3.9[35780]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 09:40:36 compute-1 sudo[35778]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:39 compute-1 sudo[35931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-asgztvarqfnskfyedmycazpjjnvfibox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161239.576081-530-36713906931710/AnsiballZ_file.py'
Jan 23 09:40:39 compute-1 sudo[35931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:40 compute-1 python3.9[35933]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:40:40 compute-1 sudo[35931]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:41 compute-1 sudo[36083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsxpxzdsvybnlgsbzppyqoosknfgpiqa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161240.975889-554-31884958553453/AnsiballZ_mount.py'
Jan 23 09:40:41 compute-1 sudo[36083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:41 compute-1 python3.9[36085]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 09:40:41 compute-1 sudo[36083]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:45 compute-1 sudo[36235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kehnedbatywbfyjqdmsdvundzoggcvqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161245.4368742-638-47543572560373/AnsiballZ_file.py'
Jan 23 09:40:45 compute-1 sudo[36235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:45 compute-1 python3.9[36237]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:40:45 compute-1 sudo[36235]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:49 compute-1 sudo[36387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hctbwmkxsxlbtydaoivchiehgmwwymml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161248.7896843-662-194883927395366/AnsiballZ_stat.py'
Jan 23 09:40:49 compute-1 sudo[36387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:49 compute-1 python3.9[36389]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:40:49 compute-1 sudo[36387]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:50 compute-1 sudo[36510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynejkqlnuvtlerfbdxdlviiwclfbfrec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161248.7896843-662-194883927395366/AnsiballZ_copy.py'
Jan 23 09:40:50 compute-1 sudo[36510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:50 compute-1 python3.9[36512]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161248.7896843-662-194883927395366/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:40:50 compute-1 sudo[36510]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:56 compute-1 sudo[36662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dyujzqlsnumddyfohxcwbnwktqpoouzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161255.8022115-734-44016160906823/AnsiballZ_stat.py'
Jan 23 09:40:56 compute-1 sudo[36662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:56 compute-1 python3.9[36664]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:40:56 compute-1 sudo[36662]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:56 compute-1 sudo[36814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcwfhcomlzprkxkxsbbmsxymykwrcnij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161256.4653697-758-128484692777212/AnsiballZ_command.py'
Jan 23 09:40:56 compute-1 sudo[36814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:57 compute-1 python3.9[36816]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:40:57 compute-1 sudo[36814]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:57 compute-1 sudo[36967]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kydrjdnkgvcgrrpqctzborcldletkpyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161257.4949338-782-170289973573578/AnsiballZ_file.py'
Jan 23 09:40:57 compute-1 sudo[36967]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:57 compute-1 python3.9[36969]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:40:58 compute-1 sudo[36967]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:58 compute-1 sudo[37119]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-irsrdsemcvbkbdkgjezounrpebwfgeim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161258.4193892-815-102836303215832/AnsiballZ_getent.py'
Jan 23 09:40:58 compute-1 sudo[37119]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:40:59 compute-1 python3.9[37121]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 09:40:59 compute-1 sudo[37119]: pam_unix(sudo:session): session closed for user root
Jan 23 09:40:59 compute-1 sudo[37272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnswurtetsjkkykvylcsfffbqffuikms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161259.4070873-839-203496359843571/AnsiballZ_group.py'
Jan 23 09:40:59 compute-1 sudo[37272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:00 compute-1 python3.9[37274]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:41:00 compute-1 groupadd[37275]: group added to /etc/group: name=qemu, GID=107
Jan 23 09:41:00 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:41:00 compute-1 groupadd[37275]: group added to /etc/gshadow: name=qemu
Jan 23 09:41:00 compute-1 groupadd[37275]: new group: name=qemu, GID=107
Jan 23 09:41:00 compute-1 sudo[37272]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:01 compute-1 sudo[37431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmcmgktbleafxngvlwcomgapzrrjpiwi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161260.6501527-863-266534503287140/AnsiballZ_user.py'
Jan 23 09:41:01 compute-1 sudo[37431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:01 compute-1 python3.9[37433]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 09:41:01 compute-1 useradd[37435]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 09:41:01 compute-1 sudo[37431]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:02 compute-1 sudo[37591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fdoqgsckotzqrrzfevaqofhxakovjudo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161262.1794531-887-163097507383562/AnsiballZ_getent.py'
Jan 23 09:41:02 compute-1 sudo[37591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:02 compute-1 python3.9[37593]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 09:41:02 compute-1 sudo[37591]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:03 compute-1 sudo[37744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkzovojmmbjbhksgxisdxiwkbjoqvthr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161262.9958766-911-131019096005113/AnsiballZ_group.py'
Jan 23 09:41:03 compute-1 sudo[37744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:03 compute-1 python3.9[37746]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:41:03 compute-1 groupadd[37747]: group added to /etc/group: name=hugetlbfs, GID=42477
Jan 23 09:41:03 compute-1 groupadd[37747]: group added to /etc/gshadow: name=hugetlbfs
Jan 23 09:41:03 compute-1 groupadd[37747]: new group: name=hugetlbfs, GID=42477
Jan 23 09:41:03 compute-1 sudo[37744]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:04 compute-1 sudo[37902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cablvnaudciiagtudrssypqdzzsfcbda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161264.0529947-938-211263600814316/AnsiballZ_file.py'
Jan 23 09:41:04 compute-1 sudo[37902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:04 compute-1 python3.9[37904]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 09:41:04 compute-1 sudo[37902]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:05 compute-1 sudo[38054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bqexydgwkwaqsykawdgtdzfawdntzywm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161265.1892529-971-59161037662604/AnsiballZ_dnf.py'
Jan 23 09:41:05 compute-1 sudo[38054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:06 compute-1 python3.9[38056]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:41:07 compute-1 sudo[38054]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:11 compute-1 sudo[38207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rmriajshcxoaiiifrffavytdntgspjca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161271.0547447-995-34041453751766/AnsiballZ_file.py'
Jan 23 09:41:11 compute-1 sudo[38207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:11 compute-1 python3.9[38209]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:41:11 compute-1 sudo[38207]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:12 compute-1 sudo[38359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qwfjddhgupnmwuxljplbwmkxezynpldx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161271.7468174-1019-30385254113136/AnsiballZ_stat.py'
Jan 23 09:41:12 compute-1 sudo[38359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:12 compute-1 python3.9[38361]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:41:12 compute-1 sudo[38359]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:12 compute-1 sudo[38482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tsbwwymbiqxspnxsodjthgrzolrhljqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161271.7468174-1019-30385254113136/AnsiballZ_copy.py'
Jan 23 09:41:12 compute-1 sudo[38482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:12 compute-1 python3.9[38484]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161271.7468174-1019-30385254113136/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:41:12 compute-1 sudo[38482]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:13 compute-1 irqbalance[791]: Cannot change IRQ 26 affinity: Operation not permitted
Jan 23 09:41:13 compute-1 irqbalance[791]: IRQ 26 affinity is now unmanaged
Jan 23 09:41:13 compute-1 sudo[38634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xkaqgctlfacqqnijnsitdymojkdejxwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161273.007913-1064-225656054600842/AnsiballZ_systemd.py'
Jan 23 09:41:13 compute-1 sudo[38634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:13 compute-1 python3.9[38636]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:41:13 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 23 09:41:13 compute-1 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Jan 23 09:41:13 compute-1 kernel: Bridge firewalling registered
Jan 23 09:41:13 compute-1 systemd-modules-load[38640]: Inserted module 'br_netfilter'
Jan 23 09:41:13 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 23 09:41:14 compute-1 sudo[38634]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:14 compute-1 sudo[38794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbsqpnoxetlhoskiqyjhqeybllcseddy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161274.202263-1088-42987376531943/AnsiballZ_stat.py'
Jan 23 09:41:14 compute-1 sudo[38794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:14 compute-1 python3.9[38796]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:41:14 compute-1 sudo[38794]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:15 compute-1 sudo[38917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xjcrcejshyslptckylcisajnpwpjaeth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161274.202263-1088-42987376531943/AnsiballZ_copy.py'
Jan 23 09:41:15 compute-1 sudo[38917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:15 compute-1 python3.9[38919]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161274.202263-1088-42987376531943/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:41:15 compute-1 sudo[38917]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:16 compute-1 sudo[39069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lexqwarxvbjlwmxqjdmzlfkcisjzdrrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161275.7464433-1142-66028555082952/AnsiballZ_dnf.py'
Jan 23 09:41:16 compute-1 sudo[39069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:16 compute-1 python3.9[39071]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:41:19 compute-1 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 09:41:19 compute-1 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 09:41:20 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:41:20 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:41:20 compute-1 systemd[1]: Reloading.
Jan 23 09:41:20 compute-1 systemd-rc-local-generator[39134]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:41:20 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:41:20 compute-1 sudo[39069]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:23 compute-1 python3.9[41978]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:41:23 compute-1 python3.9[42770]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 09:41:24 compute-1 python3.9[43088]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:41:25 compute-1 sudo[43238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bozwyxgkrdwwemnujcrhtzagpfsbtfif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161285.02161-1259-246114016933895/AnsiballZ_command.py'
Jan 23 09:41:25 compute-1 sudo[43238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:25 compute-1 python3.9[43240]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:25 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 09:41:25 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:41:25 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:41:25 compute-1 systemd[1]: man-db-cache-update.service: Consumed 5.047s CPU time.
Jan 23 09:41:25 compute-1 systemd[1]: run-rbafa2d86e4ae41b2ab845ad0ab743bb3.service: Deactivated successfully.
Jan 23 09:41:26 compute-1 systemd[1]: Starting Authorization Manager...
Jan 23 09:41:26 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 09:41:26 compute-1 polkitd[43458]: Started polkitd version 0.117
Jan 23 09:41:26 compute-1 polkitd[43458]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 09:41:26 compute-1 polkitd[43458]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 09:41:26 compute-1 polkitd[43458]: Finished loading, compiling and executing 2 rules
Jan 23 09:41:26 compute-1 polkitd[43458]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Jan 23 09:41:26 compute-1 systemd[1]: Started Authorization Manager.
Jan 23 09:41:26 compute-1 sudo[43238]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:27 compute-1 sudo[43626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhguxybcarskxwbotmyqyrhyxlrugapv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161287.2619927-1286-109057174589085/AnsiballZ_systemd.py'
Jan 23 09:41:27 compute-1 sudo[43626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:27 compute-1 python3.9[43628]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:41:27 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 09:41:27 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 09:41:27 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 09:41:27 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 09:41:28 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 09:41:28 compute-1 sudo[43626]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:29 compute-1 python3.9[43790]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 09:41:32 compute-1 sudo[43940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qezzjnfeagvkynonvdvgzccgqubahgff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161292.1598442-1457-24700829114966/AnsiballZ_systemd.py'
Jan 23 09:41:32 compute-1 sudo[43940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:32 compute-1 python3.9[43942]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:41:32 compute-1 systemd[1]: Reloading.
Jan 23 09:41:32 compute-1 systemd-rc-local-generator[43971]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:41:33 compute-1 sudo[43940]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:33 compute-1 sudo[44130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kkggxfjbbwfffyljesfleqpmwmxvlmwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161293.2350624-1457-78884965191019/AnsiballZ_systemd.py'
Jan 23 09:41:33 compute-1 sudo[44130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:33 compute-1 python3.9[44132]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:41:33 compute-1 systemd[1]: Reloading.
Jan 23 09:41:33 compute-1 systemd-rc-local-generator[44154]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:41:34 compute-1 sudo[44130]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:34 compute-1 sudo[44319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dkbohgbnqwcvbzfwpyktnpwhhwraqhbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161294.4647343-1505-62512308237649/AnsiballZ_command.py'
Jan 23 09:41:34 compute-1 sudo[44319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:34 compute-1 python3.9[44321]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:35 compute-1 sudo[44319]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:35 compute-1 sudo[44472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mltucjddlrpcbkpdfbxrapxfuhqgnvlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161295.3820255-1529-207464250031668/AnsiballZ_command.py'
Jan 23 09:41:35 compute-1 sudo[44472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:35 compute-1 python3.9[44474]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:35 compute-1 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Jan 23 09:41:35 compute-1 sudo[44472]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:36 compute-1 sudo[44625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcpktlosketnclvlhpddwilldvqbzdzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161296.155994-1553-122752775570663/AnsiballZ_command.py'
Jan 23 09:41:36 compute-1 sudo[44625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:36 compute-1 python3.9[44627]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:38 compute-1 sudo[44625]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:38 compute-1 sudo[44787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htvsxnubfhsffiyznxqgcdlblosayxnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161298.400285-1577-215975398255297/AnsiballZ_command.py'
Jan 23 09:41:38 compute-1 sudo[44787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:38 compute-1 python3.9[44789]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:41:38 compute-1 sudo[44787]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:39 compute-1 sudo[44940]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogyfuqgemhvsfjfqdrkibznywdngdzau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161299.1873064-1601-278066191389492/AnsiballZ_systemd.py'
Jan 23 09:41:39 compute-1 sudo[44940]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:39 compute-1 python3.9[44942]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:41:39 compute-1 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jan 23 09:41:39 compute-1 systemd[1]: Stopped Apply Kernel Variables.
Jan 23 09:41:39 compute-1 systemd[1]: Stopping Apply Kernel Variables...
Jan 23 09:41:39 compute-1 systemd[1]: Starting Apply Kernel Variables...
Jan 23 09:41:39 compute-1 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jan 23 09:41:39 compute-1 systemd[1]: Finished Apply Kernel Variables.
Jan 23 09:41:39 compute-1 sudo[44940]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:40 compute-1 sshd-session[31371]: Connection closed by 192.168.122.30 port 60956
Jan 23 09:41:40 compute-1 sshd-session[31368]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:41:40 compute-1 systemd[1]: session-10.scope: Deactivated successfully.
Jan 23 09:41:40 compute-1 systemd[1]: session-10.scope: Consumed 2min 8.519s CPU time.
Jan 23 09:41:40 compute-1 systemd-logind[807]: Session 10 logged out. Waiting for processes to exit.
Jan 23 09:41:40 compute-1 systemd-logind[807]: Removed session 10.
Jan 23 09:41:43 compute-1 sshd-session[44972]: Invalid user solana from 45.148.10.240 port 44514
Jan 23 09:41:43 compute-1 sshd-session[44972]: Connection closed by invalid user solana 45.148.10.240 port 44514 [preauth]
Jan 23 09:41:47 compute-1 sshd-session[44974]: Accepted publickey for zuul from 192.168.122.30 port 33452 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:41:47 compute-1 systemd-logind[807]: New session 11 of user zuul.
Jan 23 09:41:47 compute-1 systemd[1]: Started Session 11 of User zuul.
Jan 23 09:41:47 compute-1 sshd-session[44974]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:41:48 compute-1 python3.9[45127]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:41:49 compute-1 sudo[45281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kcvhcfikaymynvatvpjkbngggqluydym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161309.2074208-64-277252975366866/AnsiballZ_getent.py'
Jan 23 09:41:49 compute-1 sudo[45281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:50 compute-1 python3.9[45283]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 09:41:50 compute-1 sudo[45281]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:50 compute-1 sudo[45434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yaupnbsvwpszozyqgvqvzhuaivrlkszq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161310.4540634-88-236342957766/AnsiballZ_group.py'
Jan 23 09:41:50 compute-1 sudo[45434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:51 compute-1 python3.9[45436]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:41:51 compute-1 groupadd[45437]: group added to /etc/group: name=openvswitch, GID=42476
Jan 23 09:41:51 compute-1 groupadd[45437]: group added to /etc/gshadow: name=openvswitch
Jan 23 09:41:51 compute-1 groupadd[45437]: new group: name=openvswitch, GID=42476
Jan 23 09:41:51 compute-1 sudo[45434]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:52 compute-1 sudo[45592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aszvcyjynfhhyfewdtbofuamgdkxjxtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161311.8933427-112-24022359834443/AnsiballZ_user.py'
Jan 23 09:41:52 compute-1 sudo[45592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:52 compute-1 python3.9[45594]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 09:41:55 compute-1 useradd[45596]: new user: name=openvswitch, UID=42476, GID=42476, home=/home/openvswitch, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 09:41:55 compute-1 useradd[45596]: add 'openvswitch' to group 'hugetlbfs'
Jan 23 09:41:55 compute-1 useradd[45596]: add 'openvswitch' to shadow group 'hugetlbfs'
Jan 23 09:41:55 compute-1 sudo[45592]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:55 compute-1 sudo[45752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yunhubsgdzhvvgzzokylbospnpdfyskz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161315.4757607-142-188463724792308/AnsiballZ_setup.py'
Jan 23 09:41:55 compute-1 sudo[45752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:56 compute-1 python3.9[45754]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:41:56 compute-1 sudo[45752]: pam_unix(sudo:session): session closed for user root
Jan 23 09:41:56 compute-1 sudo[45836]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tckcfkbpvgpxsvekwnahvumdkvsjtyim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161315.4757607-142-188463724792308/AnsiballZ_dnf.py'
Jan 23 09:41:56 compute-1 sudo[45836]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:41:56 compute-1 python3.9[45838]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:41:59 compute-1 sudo[45836]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:00 compute-1 sudo[45999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-swkawalegxvwuteqmuklsdhcuuzpoxls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161320.1956334-184-148843248774268/AnsiballZ_dnf.py'
Jan 23 09:42:00 compute-1 sudo[45999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:00 compute-1 python3.9[46001]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:12 compute-1 kernel: SELinux:  Converting 2736 SID table entries...
Jan 23 09:42:12 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:42:12 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:42:12 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:42:12 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:42:12 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:42:12 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:42:12 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:42:13 compute-1 groupadd[46024]: group added to /etc/group: name=unbound, GID=994
Jan 23 09:42:13 compute-1 groupadd[46024]: group added to /etc/gshadow: name=unbound
Jan 23 09:42:13 compute-1 groupadd[46024]: new group: name=unbound, GID=994
Jan 23 09:42:13 compute-1 useradd[46031]: new user: name=unbound, UID=993, GID=994, home=/var/lib/unbound, shell=/sbin/nologin, from=none
Jan 23 09:42:13 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Jan 23 09:42:13 compute-1 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Jan 23 09:42:15 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:42:15 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:42:15 compute-1 systemd[1]: Reloading.
Jan 23 09:42:15 compute-1 systemd-rc-local-generator[46527]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:15 compute-1 systemd-sysv-generator[46533]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:15 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:42:16 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:42:16 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:42:16 compute-1 systemd[1]: run-r3da8bb381b17428c89ac1ac1d3e19566.service: Deactivated successfully.
Jan 23 09:42:16 compute-1 sudo[45999]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:24 compute-1 sudo[47097]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxlkapwoonnvwzxduzbjnrgzrkodbwyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161343.9849813-208-254752501594249/AnsiballZ_systemd.py'
Jan 23 09:42:24 compute-1 sudo[47097]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:24 compute-1 python3.9[47099]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:42:24 compute-1 systemd[1]: Reloading.
Jan 23 09:42:25 compute-1 systemd-sysv-generator[47133]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:25 compute-1 systemd-rc-local-generator[47130]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:25 compute-1 systemd[1]: Starting Open vSwitch Database Unit...
Jan 23 09:42:25 compute-1 chown[47141]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Jan 23 09:42:25 compute-1 ovs-ctl[47146]: /etc/openvswitch/conf.db does not exist ... (warning).
Jan 23 09:42:25 compute-1 ovs-ctl[47146]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Jan 23 09:42:25 compute-1 ovs-ctl[47146]: Starting ovsdb-server [  OK  ]
Jan 23 09:42:25 compute-1 ovs-vsctl[47195]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Jan 23 09:42:25 compute-1 ovs-vsctl[47211]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"170ec811-bf2b-4b3a-9339-50a49c79a1e6\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Jan 23 09:42:25 compute-1 ovs-ctl[47146]: Configuring Open vSwitch system IDs [  OK  ]
Jan 23 09:42:25 compute-1 ovs-ctl[47146]: Enabling remote OVSDB managers [  OK  ]
Jan 23 09:42:25 compute-1 systemd[1]: Started Open vSwitch Database Unit.
Jan 23 09:42:25 compute-1 ovs-vsctl[47221]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 23 09:42:25 compute-1 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Jan 23 09:42:25 compute-1 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Jan 23 09:42:25 compute-1 systemd[1]: Starting Open vSwitch Forwarding Unit...
Jan 23 09:42:25 compute-1 kernel: openvswitch: Open vSwitch switching datapath
Jan 23 09:42:25 compute-1 ovs-ctl[47266]: Inserting openvswitch module [  OK  ]
Jan 23 09:42:25 compute-1 ovs-ctl[47235]: Starting ovs-vswitchd [  OK  ]
Jan 23 09:42:25 compute-1 ovs-ctl[47235]: Enabling remote OVSDB managers [  OK  ]
Jan 23 09:42:25 compute-1 systemd[1]: Started Open vSwitch Forwarding Unit.
Jan 23 09:42:25 compute-1 ovs-vsctl[47283]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-1
Jan 23 09:42:25 compute-1 systemd[1]: Starting Open vSwitch...
Jan 23 09:42:25 compute-1 systemd[1]: Finished Open vSwitch.
Jan 23 09:42:25 compute-1 sudo[47097]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:27 compute-1 python3.9[47435]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:42:28 compute-1 sudo[47585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tozbqhqngmnwqdmztodcdknlluuzoqhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161347.8417954-262-177663305389277/AnsiballZ_sefcontext.py'
Jan 23 09:42:28 compute-1 sudo[47585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:28 compute-1 python3.9[47587]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 09:42:30 compute-1 kernel: SELinux:  Converting 2750 SID table entries...
Jan 23 09:42:30 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 09:42:30 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 23 09:42:30 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 09:42:30 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 23 09:42:30 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 09:42:30 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 09:42:30 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 09:42:30 compute-1 sudo[47585]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:31 compute-1 python3.9[47742]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:42:32 compute-1 sudo[47898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpnkaiftfmmtaszbmzkfogdqhkosbsim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161352.2926025-316-261953248018870/AnsiballZ_dnf.py'
Jan 23 09:42:32 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Jan 23 09:42:32 compute-1 sudo[47898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:32 compute-1 python3.9[47900]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:34 compute-1 sudo[47898]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:35 compute-1 sudo[48051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shoocncpqevuelacokxapfwxwuankusi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161354.8690195-340-193869936129677/AnsiballZ_command.py'
Jan 23 09:42:35 compute-1 sudo[48051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:35 compute-1 python3.9[48053]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:42:36 compute-1 sudo[48051]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:37 compute-1 sudo[48338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eztmvkldxianqjziiqhdimhcwraizymd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161356.6537213-364-36308796492260/AnsiballZ_file.py'
Jan 23 09:42:37 compute-1 sudo[48338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:37 compute-1 python3.9[48340]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 09:42:37 compute-1 sudo[48338]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:38 compute-1 python3.9[48490]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:42:38 compute-1 sudo[48642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aklwctgssqjwmneepabchlmlogxncxdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161358.446181-412-68302287020065/AnsiballZ_dnf.py'
Jan 23 09:42:38 compute-1 sudo[48642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:38 compute-1 python3.9[48644]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:40 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:42:40 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:42:40 compute-1 systemd[1]: Reloading.
Jan 23 09:42:40 compute-1 systemd-rc-local-generator[48682]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:40 compute-1 systemd-sysv-generator[48686]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:41 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:42:41 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:42:41 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:42:41 compute-1 systemd[1]: run-r2070ae03a98d45fbb5dd949472118885.service: Deactivated successfully.
Jan 23 09:42:41 compute-1 sudo[48642]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:44 compute-1 sudo[48958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njadhcmrsiwxhhrxwxoiljklmahirpwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161364.274503-436-122339818534380/AnsiballZ_systemd.py'
Jan 23 09:42:44 compute-1 sudo[48958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:45 compute-1 python3.9[48960]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:42:45 compute-1 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Jan 23 09:42:45 compute-1 systemd[1]: Stopped Network Manager Wait Online.
Jan 23 09:42:45 compute-1 systemd[1]: Stopping Network Manager Wait Online...
Jan 23 09:42:45 compute-1 systemd[1]: Stopping Network Manager...
Jan 23 09:42:45 compute-1 NetworkManager[7216]: <info>  [1769161365.1029] caught SIGTERM, shutting down normally.
Jan 23 09:42:45 compute-1 NetworkManager[7216]: <info>  [1769161365.1046] dhcp4 (eth0): canceled DHCP transaction
Jan 23 09:42:45 compute-1 NetworkManager[7216]: <info>  [1769161365.1046] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:42:45 compute-1 NetworkManager[7216]: <info>  [1769161365.1046] dhcp4 (eth0): state changed no lease
Jan 23 09:42:45 compute-1 NetworkManager[7216]: <info>  [1769161365.1048] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:42:45 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:42:45 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:42:45 compute-1 NetworkManager[7216]: <info>  [1769161365.2011] exiting (success)
Jan 23 09:42:45 compute-1 systemd[1]: NetworkManager.service: Deactivated successfully.
Jan 23 09:42:45 compute-1 systemd[1]: Stopped Network Manager.
Jan 23 09:42:45 compute-1 systemd[1]: NetworkManager.service: Consumed 14.428s CPU time, 4.1M memory peak, read 0B from disk, written 17.0K to disk.
Jan 23 09:42:45 compute-1 systemd[1]: Starting Network Manager...
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.2738] NetworkManager (version 1.54.3-2.el9) is starting... (after a restart, boot:0ec3f185-e60c-43ea-a74e-c21caf2508ae)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.2739] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.2806] manager[0x555d442be000]: monitoring kernel firmware directory '/lib/firmware'.
Jan 23 09:42:45 compute-1 systemd[1]: Starting Hostname Service...
Jan 23 09:42:45 compute-1 systemd[1]: Started Hostname Service.
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3653] hostname: hostname: using hostnamed
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3655] hostname: static hostname changed from (none) to "compute-1"
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3659] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3662] manager[0x555d442be000]: rfkill: Wi-Fi hardware radio set enabled
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3663] manager[0x555d442be000]: rfkill: WWAN hardware radio set enabled
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3682] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-ovs.so)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3691] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-device-plugin-team.so)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3692] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3692] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3693] manager: Networking is enabled by state file
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3695] settings: Loaded settings plugin: keyfile (internal)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3699] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.3-2.el9/libnm-settings-plugin-ifcfg-rh.so")
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3726] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3733] dhcp: init: Using DHCP client 'internal'
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3736] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3741] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3745] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3752] device (lo): Activation: starting connection 'lo' (6a1055b1-2674-4e8e-9fff-1fce9dcc1052)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3758] device (eth0): carrier: link connected
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3761] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3767] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3768] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3774] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3780] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3786] device (eth1): carrier: link connected
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3790] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3795] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50) (indicated)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3796] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3801] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3807] device (eth1): Activation: starting connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3812] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Jan 23 09:42:45 compute-1 systemd[1]: Started Network Manager.
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3819] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3822] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3827] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3830] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3833] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3836] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3838] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3851] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3861] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3865] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3874] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3890] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3903] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3905] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3910] device (lo): Activation: successful, device activated.
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3918] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3922] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3927] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3931] manager: NetworkManager state is now CONNECTED_LOCAL
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3933] device (eth1): Activation: successful, device activated.
Jan 23 09:42:45 compute-1 systemd[1]: Starting Network Manager Wait Online...
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.3951] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Jan 23 09:42:45 compute-1 sudo[48958]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.5601] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.5638] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.5640] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.5644] manager: NetworkManager state is now CONNECTED_SITE
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.5646] device (eth0): Activation: successful, device activated.
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.5651] manager: NetworkManager state is now CONNECTED_GLOBAL
Jan 23 09:42:45 compute-1 NetworkManager[48978]: <info>  [1769161365.5653] manager: startup complete
Jan 23 09:42:45 compute-1 systemd[1]: Finished Network Manager Wait Online.
Jan 23 09:42:46 compute-1 sudo[49184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfpijyjpnelygvokozqcsawvgnbzkhen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161365.7163756-460-139461224952087/AnsiballZ_dnf.py'
Jan 23 09:42:46 compute-1 sudo[49184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:42:46 compute-1 python3.9[49186]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:42:53 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 09:42:53 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 09:42:53 compute-1 systemd[1]: Reloading.
Jan 23 09:42:53 compute-1 systemd-rc-local-generator[49239]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:42:53 compute-1 systemd-sysv-generator[49243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:42:53 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 09:42:55 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 09:42:55 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 09:42:55 compute-1 systemd[1]: run-re0ba4bfdb04f412b91ccd3f0960499de.service: Deactivated successfully.
Jan 23 09:42:55 compute-1 sudo[49184]: pam_unix(sudo:session): session closed for user root
Jan 23 09:42:55 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:43:00 compute-1 sudo[49643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjqbbwxiaihjwpbhpvecoxsjwznjehkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161379.8718357-496-15975709820644/AnsiballZ_stat.py'
Jan 23 09:43:00 compute-1 sudo[49643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:00 compute-1 python3.9[49645]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:43:00 compute-1 sudo[49643]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:01 compute-1 sudo[49795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzwdejkcyrvafcwwsfjmndzhahcccgrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161380.5856102-523-10958221410707/AnsiballZ_ini_file.py'
Jan 23 09:43:01 compute-1 sudo[49795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:01 compute-1 python3.9[49797]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:01 compute-1 sudo[49795]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:01 compute-1 sudo[49949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zvizkoudepznsdvysowfxsywslaqgnaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161381.556736-553-44119366493929/AnsiballZ_ini_file.py'
Jan 23 09:43:01 compute-1 sudo[49949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:02 compute-1 python3.9[49951]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:02 compute-1 sudo[49949]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:02 compute-1 sudo[50101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcziratbzxmifxqrecnrmsyymdilqvre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161382.217925-553-63704336361136/AnsiballZ_ini_file.py'
Jan 23 09:43:02 compute-1 sudo[50101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:02 compute-1 python3.9[50103]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:02 compute-1 sudo[50101]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:03 compute-1 sudo[50253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-volixizquidtxssolxsbvkfayxspiycb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161382.918543-598-242475050189707/AnsiballZ_ini_file.py'
Jan 23 09:43:03 compute-1 sudo[50253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:03 compute-1 python3.9[50255]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:03 compute-1 sudo[50253]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:03 compute-1 sudo[50405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sjszvlmrqzkktbanhsbisxezbgpxysfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161383.5408463-598-121848748469884/AnsiballZ_ini_file.py'
Jan 23 09:43:03 compute-1 sudo[50405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:04 compute-1 python3.9[50407]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:04 compute-1 sudo[50405]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:04 compute-1 sudo[50557]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oemzliblzltbruigzqpkvrsnindsueho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161384.2408855-643-241763987821859/AnsiballZ_stat.py'
Jan 23 09:43:04 compute-1 sudo[50557]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:04 compute-1 python3.9[50559]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:04 compute-1 sudo[50557]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:05 compute-1 sudo[50680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ogwxrtgpcucaezwujcctixkehljxmbos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161384.2408855-643-241763987821859/AnsiballZ_copy.py'
Jan 23 09:43:05 compute-1 sudo[50680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:05 compute-1 python3.9[50682]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161384.2408855-643-241763987821859/.source _original_basename=.yiyg6a5s follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:05 compute-1 sudo[50680]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:05 compute-1 sudo[50832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jacltwfdfzccpsgjexxvxexvpddoodeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161385.7116432-688-14501591321464/AnsiballZ_file.py'
Jan 23 09:43:05 compute-1 sudo[50832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:06 compute-1 python3.9[50834]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:06 compute-1 sudo[50832]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:06 compute-1 sudo[50984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fxsjoqdhzfxeahskphbkvwecqtrxprjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161386.383674-712-48950730802966/AnsiballZ_edpm_os_net_config_mappings.py'
Jan 23 09:43:06 compute-1 sudo[50984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:07 compute-1 python3.9[50986]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Jan 23 09:43:07 compute-1 sudo[50984]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:07 compute-1 sudo[51136]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-igiqbtljbcyqbhayipyxtvbwsnczsptn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161387.313121-739-279778292259618/AnsiballZ_file.py'
Jan 23 09:43:07 compute-1 sudo[51136]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:07 compute-1 python3.9[51138]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:07 compute-1 sudo[51136]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:08 compute-1 sudo[51288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpghxfbazgxdafujitxhtabfnmsiqnnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161388.2300458-769-197088049086713/AnsiballZ_stat.py'
Jan 23 09:43:08 compute-1 sudo[51288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:08 compute-1 sudo[51288]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:09 compute-1 sudo[51411]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcrpfopcqybktrltpvwuqbuciolebwzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161388.2300458-769-197088049086713/AnsiballZ_copy.py'
Jan 23 09:43:09 compute-1 sudo[51411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:09 compute-1 sudo[51411]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:10 compute-1 sudo[51563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duyupdkmhndnflmyetsqvtgcdlvotcbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161389.7471454-814-186258899655664/AnsiballZ_slurp.py'
Jan 23 09:43:10 compute-1 sudo[51563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:10 compute-1 python3.9[51565]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Jan 23 09:43:10 compute-1 sudo[51563]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:11 compute-1 sudo[51738]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahrxosyxguadypenhxqsfhxkaavejpol ; ANSIBLE_ASYNC_DIR=\'~/.ansible_async\' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.6718836-841-8122361076958/async_wrapper.py j728794609016 300 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.6718836-841-8122361076958/AnsiballZ_edpm_os_net_config.py _'
Jan 23 09:43:11 compute-1 sudo[51738]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:11 compute-1 ansible-async_wrapper.py[51740]: Invoked with j728794609016 300 /home/zuul/.ansible/tmp/ansible-tmp-1769161390.6718836-841-8122361076958/AnsiballZ_edpm_os_net_config.py _
Jan 23 09:43:11 compute-1 ansible-async_wrapper.py[51743]: Starting module and watcher
Jan 23 09:43:11 compute-1 ansible-async_wrapper.py[51743]: Start watching 51744 (300)
Jan 23 09:43:11 compute-1 ansible-async_wrapper.py[51744]: Start module (51744)
Jan 23 09:43:11 compute-1 ansible-async_wrapper.py[51740]: Return async_wrapper task started.
Jan 23 09:43:11 compute-1 sudo[51738]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:11 compute-1 python3.9[51745]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Jan 23 09:43:12 compute-1 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Jan 23 09:43:12 compute-1 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Jan 23 09:43:12 compute-1 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Jan 23 09:43:12 compute-1 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Jan 23 09:43:12 compute-1 kernel: cfg80211: failed to load regulatory.db
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7285] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7304] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7853] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7855] audit: op="connection-add" uuid="fcb52b2f-b59b-4641-8ef0-a8a3fe18cf9d" name="br-ex-br" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7869] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7870] audit: op="connection-add" uuid="b85b0313-6538-4d60-ae77-e76f4e59afd5" name="br-ex-port" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7880] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7881] audit: op="connection-add" uuid="83ff4c95-1394-48ce-bd5f-a5049c430383" name="eth1-port" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7891] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7892] audit: op="connection-add" uuid="c34547fa-5413-4521-a86b-c27c1e22e373" name="vlan20-port" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7903] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7904] audit: op="connection-add" uuid="fde3432a-6a8b-4ca7-a27a-9806a9829092" name="vlan21-port" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7914] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7915] audit: op="connection-add" uuid="8178505a-1b68-430c-82b2-c078deaaa866" name="vlan22-port" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7925] manager: (vlan23): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/10)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7926] audit: op="connection-add" uuid="b7d2bc4e-ab13-4715-bbc4-69fde53fe582" name="vlan23-port" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7943] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,connection.timestamp,ipv6.dhcp-timeout,ipv6.method,ipv6.addr-gen-mode" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7957] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.7958] audit: op="connection-add" uuid="15ca7997-d48c-49d0-811d-2a7146518225" name="br-ex-if" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8003] audit: op="connection-update" uuid="f0d37197-be61-575f-8210-b0dbd6f4eb50" name="ci-private-network" args="ovs-external-ids.data,ovs-interface.type,ipv4.dns,ipv4.addresses,ipv4.routes,ipv4.routing-rules,ipv4.method,ipv4.never-default,connection.port-type,connection.master,connection.controller,connection.slave-type,connection.timestamp,ipv6.dns,ipv6.addresses,ipv6.routes,ipv6.routing-rules,ipv6.method,ipv6.addr-gen-mode" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8025] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8027] audit: op="connection-add" uuid="441fef83-1900-4302-8a85-ab4614af5f62" name="vlan20-if" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8046] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8047] audit: op="connection-add" uuid="7aa698c4-7f45-4cde-9381-28f8d723332a" name="vlan21-if" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8066] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8067] audit: op="connection-add" uuid="21ce2276-23b7-471d-a22a-02b98a19bbe0" name="vlan22-if" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8087] manager: (vlan23): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/15)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8088] audit: op="connection-add" uuid="3a9b6f96-dcfa-44cc-acda-75f5e6b470a6" name="vlan23-if" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8101] audit: op="connection-delete" uuid="c9ce933b-996c-3254-bccc-8d3373d274f1" name="Wired connection 1" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8113] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8116] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Success
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8124] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8130] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (fcb52b2f-b59b-4641-8ef0-a8a3fe18cf9d)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8131] audit: op="connection-activate" uuid="fcb52b2f-b59b-4641-8ef0-a8a3fe18cf9d" name="br-ex-br" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8133] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8134] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8140] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8144] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (b85b0313-6538-4d60-ae77-e76f4e59afd5)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8146] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8147] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8151] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8155] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (83ff4c95-1394-48ce-bd5f-a5049c430383)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8157] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8158] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8164] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8169] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (c34547fa-5413-4521-a86b-c27c1e22e373)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8171] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8172] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8178] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8182] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (fde3432a-6a8b-4ca7-a27a-9806a9829092)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8184] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8185] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8192] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8197] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (8178505a-1b68-430c-82b2-c078deaaa866)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8199] device (vlan23)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8200] device (vlan23)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8206] device (vlan23)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8210] device (vlan23)[Open vSwitch Port]: Activation: starting connection 'vlan23-port' (b7d2bc4e-ab13-4715-bbc4-69fde53fe582)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8211] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8213] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8216] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8224] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8225] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8228] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8232] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (15ca7997-d48c-49d0-811d-2a7146518225)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8233] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8236] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8238] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8239] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8240] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8251] device (eth1): disconnecting for new activation request.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8251] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8254] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8256] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8257] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8260] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8261] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8264] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8268] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (441fef83-1900-4302-8a85-ab4614af5f62)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8269] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8271] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8273] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8274] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8277] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8278] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8281] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8285] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (7aa698c4-7f45-4cde-9381-28f8d723332a)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8286] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8290] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8291] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8292] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8296] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8297] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8300] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8306] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (21ce2276-23b7-471d-a22a-02b98a19bbe0)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8307] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8310] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8312] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8313] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8317] device (vlan23)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <warn>  [1769161393.8318] device (vlan23)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8324] device (vlan23)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8334] device (vlan23)[Open vSwitch Interface]: Activation: starting connection 'vlan23-if' (3a9b6f96-dcfa-44cc-acda-75f5e6b470a6)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8336] device (vlan23)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8341] device (vlan23)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8343] device (vlan23)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8345] device (vlan23)[Open vSwitch Port]: Activation: connection 'vlan23-port' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8347] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8363] audit: op="device-reapply" interface="eth0" ifindex=2 args="802-3-ethernet.mtu,ipv4.dhcp-timeout,ipv4.dhcp-client-id,connection.autoconnect-priority,ipv6.method,ipv6.addr-gen-mode" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8366] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8369] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8371] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8384] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8389] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8392] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8394] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8396] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8400] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 kernel: ovs-system: entered promiscuous mode
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8405] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8407] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8409] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8414] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8418] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8421] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8423] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8427] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8432] device (vlan23)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 kernel: Timeout policy base is empty
Jan 23 09:43:13 compute-1 systemd-udevd[51749]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8434] device (vlan23)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8439] device (vlan23)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8443] device (vlan23)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8448] dhcp4 (eth0): canceled DHCP transaction
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8448] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8448] dhcp4 (eth0): state changed no lease
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8450] dhcp4 (eth0): activation: beginning transaction (no timeout)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8461] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8470] audit: op="device-reapply" interface="eth1" ifindex=3 pid=51746 uid=0 result="fail" reason="Device is not activated"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8474] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8482] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8491] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8499] device (vlan23)[Open vSwitch Interface]: Activation: connection 'vlan23-if' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8501] dhcp4 (eth0): state changed new lease, address=38.129.56.30
Jan 23 09:43:13 compute-1 systemd[1]: Starting Network Manager Script Dispatcher Service...
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8562] device (eth1): disconnecting for new activation request.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8563] audit: op="connection-activate" uuid="f0d37197-be61-575f-8210-b0dbd6f4eb50" name="ci-private-network" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 systemd[1]: Started Network Manager Script Dispatcher Service.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8677] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8678] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8811] device (eth1): Activation: starting connection 'ci-private-network' (f0d37197-be61-575f-8210-b0dbd6f4eb50)
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8815] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8816] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8817] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8818] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8818] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8819] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8820] device (vlan23)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8825] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8827] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8832] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8836] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8839] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8841] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 kernel: br-ex: entered promiscuous mode
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8844] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8847] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8850] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8853] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8856] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8858] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8861] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8864] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8867] device (vlan23)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8870] device (vlan23)[Open vSwitch Port]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8882] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8899] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8902] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8940] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 kernel: vlan22: entered promiscuous mode
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8946] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.8950] device (eth1): Activation: successful, device activated.
Jan 23 09:43:13 compute-1 systemd-udevd[51751]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9005] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9020] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 kernel: vlan21: entered promiscuous mode
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9047] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9049] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9052] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9092] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9102] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9132] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9134] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9137] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 kernel: vlan23: entered promiscuous mode
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9174] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9188] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9217] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9218] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9221] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 kernel: vlan20: entered promiscuous mode
Jan 23 09:43:13 compute-1 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9281] device (vlan23)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9292] device (vlan23)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9317] device (vlan23)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9318] device (vlan23)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9323] device (vlan23)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9366] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9379] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9404] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9406] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Jan 23 09:43:13 compute-1 NetworkManager[48978]: <info>  [1769161393.9410] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.0511] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 09:43:15 compute-1 sudo[52101]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uinqxbzcdnwgpvmmnafijbvbjewqvyqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161394.6870978-841-277445673343862/AnsiballZ_async_status.py'
Jan 23 09:43:15 compute-1 sudo[52101]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.2161] checkpoint[0x555d44294950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.2167] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=51746 uid=0 result="success"
Jan 23 09:43:15 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 09:43:15 compute-1 python3.9[52103]: ansible-ansible.legacy.async_status Invoked with jid=j728794609016.51740 mode=status _async_dir=/root/.ansible_async
Jan 23 09:43:15 compute-1 sudo[52101]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.5173] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.5185] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.7595] audit: op="networking-control" arg="global-dns-configuration" pid=51746 uid=0 result="success"
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.7629] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.7656] audit: op="networking-control" arg="global-dns-configuration" pid=51746 uid=0 result="success"
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.8095] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 09:43:15 compute-1 NetworkManager[48978]: <info>  [1769161395.9999] checkpoint[0x555d44294a20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Jan 23 09:43:16 compute-1 NetworkManager[48978]: <info>  [1769161396.0003] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=51746 uid=0 result="success"
Jan 23 09:43:16 compute-1 ansible-async_wrapper.py[51744]: Module complete (51744)
Jan 23 09:43:16 compute-1 ansible-async_wrapper.py[51743]: Done in kid B.
Jan 23 09:43:18 compute-1 sudo[52210]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvntfrhrkoxpmxpoggrhwunugltdwkxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161394.6870978-841-277445673343862/AnsiballZ_async_status.py'
Jan 23 09:43:18 compute-1 sudo[52210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:18 compute-1 python3.9[52212]: ansible-ansible.legacy.async_status Invoked with jid=j728794609016.51740 mode=status _async_dir=/root/.ansible_async
Jan 23 09:43:18 compute-1 sudo[52210]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:19 compute-1 sudo[52310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwbwyvsbjfwardztlxprzeyzppnqqykc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161394.6870978-841-277445673343862/AnsiballZ_async_status.py'
Jan 23 09:43:19 compute-1 sudo[52310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:19 compute-1 python3.9[52312]: ansible-ansible.legacy.async_status Invoked with jid=j728794609016.51740 mode=cleanup _async_dir=/root/.ansible_async
Jan 23 09:43:19 compute-1 sudo[52310]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:20 compute-1 sudo[52462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jojsimshnyjunmdrbibwihdwqptcnksx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161399.784632-922-40190662143048/AnsiballZ_stat.py'
Jan 23 09:43:20 compute-1 sudo[52462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:20 compute-1 python3.9[52464]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:20 compute-1 sudo[52462]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:20 compute-1 sudo[52585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdhctbysehocosvozfynlhshaqiyygfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161399.784632-922-40190662143048/AnsiballZ_copy.py'
Jan 23 09:43:20 compute-1 sudo[52585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:20 compute-1 python3.9[52587]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161399.784632-922-40190662143048/.source.returncode _original_basename=.r_ria67y follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:20 compute-1 sudo[52585]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:21 compute-1 sudo[52737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evupkarunktsqphpmvspprlrddxcpboh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161401.1368065-970-98063642981915/AnsiballZ_stat.py'
Jan 23 09:43:21 compute-1 sudo[52737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:21 compute-1 python3.9[52739]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:21 compute-1 sudo[52737]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:22 compute-1 sudo[52860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gmcnruezjhvcneegddhhdtdpnmtxwlyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161401.1368065-970-98063642981915/AnsiballZ_copy.py'
Jan 23 09:43:22 compute-1 sudo[52860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:22 compute-1 python3.9[52862]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161401.1368065-970-98063642981915/.source.cfg _original_basename=.j53kx5k0 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:22 compute-1 sudo[52860]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:22 compute-1 sudo[53013]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhhvkhgoyunkyglakmhbgwzgmuzgpoak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161402.4589345-1015-29532378194090/AnsiballZ_systemd.py'
Jan 23 09:43:22 compute-1 sudo[53013]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:23 compute-1 python3.9[53015]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:43:23 compute-1 systemd[1]: Reloading Network Manager...
Jan 23 09:43:23 compute-1 NetworkManager[48978]: <info>  [1769161403.2076] audit: op="reload" arg="0" pid=53019 uid=0 result="success"
Jan 23 09:43:23 compute-1 NetworkManager[48978]: <info>  [1769161403.2082] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Jan 23 09:43:23 compute-1 systemd[1]: Reloaded Network Manager.
Jan 23 09:43:23 compute-1 sudo[53013]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:23 compute-1 sshd-session[44977]: Connection closed by 192.168.122.30 port 33452
Jan 23 09:43:23 compute-1 sshd-session[44974]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:43:23 compute-1 systemd[1]: session-11.scope: Deactivated successfully.
Jan 23 09:43:23 compute-1 systemd[1]: session-11.scope: Consumed 50.625s CPU time.
Jan 23 09:43:23 compute-1 systemd-logind[807]: Session 11 logged out. Waiting for processes to exit.
Jan 23 09:43:23 compute-1 systemd-logind[807]: Removed session 11.
Jan 23 09:43:29 compute-1 sshd-session[53050]: Accepted publickey for zuul from 192.168.122.30 port 56956 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:43:29 compute-1 systemd-logind[807]: New session 12 of user zuul.
Jan 23 09:43:29 compute-1 systemd[1]: Started Session 12 of User zuul.
Jan 23 09:43:29 compute-1 sshd-session[53050]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:43:30 compute-1 python3.9[53203]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:32 compute-1 python3.9[53358]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:43:33 compute-1 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Jan 23 09:43:33 compute-1 python3.9[53553]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:43:33 compute-1 sshd-session[53053]: Connection closed by 192.168.122.30 port 56956
Jan 23 09:43:33 compute-1 sshd-session[53050]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:43:33 compute-1 systemd-logind[807]: Session 12 logged out. Waiting for processes to exit.
Jan 23 09:43:33 compute-1 systemd[1]: session-12.scope: Deactivated successfully.
Jan 23 09:43:33 compute-1 systemd[1]: session-12.scope: Consumed 2.541s CPU time.
Jan 23 09:43:33 compute-1 systemd-logind[807]: Removed session 12.
Jan 23 09:43:39 compute-1 sshd-session[53581]: Accepted publickey for zuul from 192.168.122.30 port 58774 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:43:39 compute-1 systemd-logind[807]: New session 13 of user zuul.
Jan 23 09:43:39 compute-1 systemd[1]: Started Session 13 of User zuul.
Jan 23 09:43:39 compute-1 sshd-session[53581]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:43:40 compute-1 python3.9[53734]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:41 compute-1 python3.9[53888]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:42 compute-1 sudo[54043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bnugiwinxxemqcqmzgzprgmpgcicwcwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161421.9601593-76-154284686858366/AnsiballZ_setup.py'
Jan 23 09:43:42 compute-1 sudo[54043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:42 compute-1 python3.9[54045]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:43:42 compute-1 sudo[54043]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:43 compute-1 sudo[54127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hwhxqgewlnsomczfdvmdqiwupifibksa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161421.9601593-76-154284686858366/AnsiballZ_dnf.py'
Jan 23 09:43:43 compute-1 sudo[54127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:43 compute-1 python3.9[54129]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:43:44 compute-1 sudo[54127]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:46 compute-1 sudo[54281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wljxaetwutyfkcrmgaiqgdwgtuiirhdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161425.6684864-112-115268541904002/AnsiballZ_setup.py'
Jan 23 09:43:46 compute-1 sudo[54281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:46 compute-1 python3.9[54283]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:43:46 compute-1 sudo[54281]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:47 compute-1 sudo[54476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwefuzdbvpedoqvcpocidjcmzeuzuqdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161426.9239924-145-170194064114864/AnsiballZ_file.py'
Jan 23 09:43:47 compute-1 sudo[54476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:47 compute-1 python3.9[54478]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:47 compute-1 sudo[54476]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:48 compute-1 sudo[54628]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbtodlxlkygblpwknizywhllcixjfsym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161427.6998198-169-166355048860074/AnsiballZ_command.py'
Jan 23 09:43:48 compute-1 sudo[54628]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:48 compute-1 python3.9[54630]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:43:48 compute-1 podman[54631]: 2026-01-23 09:43:48.413618339 +0000 UTC m=+0.054328689 system refresh
Jan 23 09:43:48 compute-1 sudo[54628]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:43:49 compute-1 sudo[54789]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yqqnycbvhhgmsewavqxzukzcrmlzehta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161428.7140532-193-247961873399552/AnsiballZ_stat.py'
Jan 23 09:43:49 compute-1 sudo[54789]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:49 compute-1 python3.9[54791]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:49 compute-1 sudo[54789]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:50 compute-1 sudo[54912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zqbhtgnmgzwnychdiyajplkuphinqjny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161428.7140532-193-247961873399552/AnsiballZ_copy.py'
Jan 23 09:43:50 compute-1 sudo[54912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:50 compute-1 python3.9[54914]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161428.7140532-193-247961873399552/.source.json follow=False _original_basename=podman_network_config.j2 checksum=6d3b755833236f070a036449324fcb17c483d383 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:43:50 compute-1 sudo[54912]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:50 compute-1 sudo[55064]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dnqitseovtszyyxadopxpddctqsamnqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161430.539225-238-37214369718531/AnsiballZ_stat.py'
Jan 23 09:43:50 compute-1 sudo[55064]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:51 compute-1 python3.9[55066]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:43:51 compute-1 sudo[55064]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:51 compute-1 sudo[55187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbqlkflynbcathggacxwlbrqgxcqugdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161430.539225-238-37214369718531/AnsiballZ_copy.py'
Jan 23 09:43:51 compute-1 sudo[55187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:51 compute-1 python3.9[55189]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161430.539225-238-37214369718531/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:51 compute-1 sudo[55187]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:52 compute-1 sudo[55339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-omehxixibksoeavmxupzeavjxvwxhbue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161431.9770947-286-171449192263667/AnsiballZ_ini_file.py'
Jan 23 09:43:52 compute-1 sudo[55339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:52 compute-1 python3.9[55341]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:52 compute-1 sudo[55339]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:53 compute-1 sudo[55491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vouytyrdlzwaapnyjswdvghcpnoroqdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161432.7562025-286-56455217701230/AnsiballZ_ini_file.py'
Jan 23 09:43:53 compute-1 sudo[55491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:53 compute-1 python3.9[55493]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:53 compute-1 sudo[55491]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:53 compute-1 sudo[55643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efvfpaaccpqzymyjllvwboftbxekiuir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161433.4027689-286-125843607888349/AnsiballZ_ini_file.py'
Jan 23 09:43:53 compute-1 sudo[55643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:53 compute-1 python3.9[55645]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:53 compute-1 sudo[55643]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:54 compute-1 sudo[55795]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dababxrfilnrkzhejcbkraykzyuaxtyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161434.102409-286-169028828655879/AnsiballZ_ini_file.py'
Jan 23 09:43:54 compute-1 sudo[55795]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:54 compute-1 python3.9[55797]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:43:54 compute-1 sudo[55795]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:55 compute-1 sudo[55947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gfyqwmavgnftabfbsrghmrorccxifonv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161434.8778007-379-150336711090415/AnsiballZ_dnf.py'
Jan 23 09:43:55 compute-1 sudo[55947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:55 compute-1 python3.9[55949]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:43:56 compute-1 sudo[55947]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:57 compute-1 sshd-session[55951]: Invalid user solana from 45.148.10.240 port 55258
Jan 23 09:43:57 compute-1 sshd-session[55951]: Connection closed by invalid user solana 45.148.10.240 port 55258 [preauth]
Jan 23 09:43:57 compute-1 sudo[56102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okkhuabnuaqtsxdtnhomqezledvwkpkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161437.4070153-412-16074382275879/AnsiballZ_setup.py'
Jan 23 09:43:57 compute-1 sudo[56102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:58 compute-1 python3.9[56104]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:43:58 compute-1 sudo[56102]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:58 compute-1 sudo[56256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnklbaseaqjdpcwtaafoccpkoefpucam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161438.2440906-436-154141451176380/AnsiballZ_stat.py'
Jan 23 09:43:58 compute-1 sudo[56256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:58 compute-1 python3.9[56258]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:43:58 compute-1 sudo[56256]: pam_unix(sudo:session): session closed for user root
Jan 23 09:43:59 compute-1 sudo[56408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cgbnfqewicxnhrasewrumwhkeixpmxsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161438.997474-463-160089615614994/AnsiballZ_stat.py'
Jan 23 09:43:59 compute-1 sudo[56408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:43:59 compute-1 python3.9[56410]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:43:59 compute-1 sudo[56408]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:00 compute-1 sudo[56560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whirywnyoxxbgjnkcjzbudfodwdsdwmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161439.7993405-493-123783787710383/AnsiballZ_command.py'
Jan 23 09:44:00 compute-1 sudo[56560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:00 compute-1 python3.9[56562]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:44:00 compute-1 sudo[56560]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:01 compute-1 sudo[56713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xinssufgumktlhodmhrqvsirmzthylsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161440.5971203-523-248385865163970/AnsiballZ_service_facts.py'
Jan 23 09:44:01 compute-1 sudo[56713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:01 compute-1 python3.9[56715]: ansible-service_facts Invoked
Jan 23 09:44:01 compute-1 network[56732]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:44:01 compute-1 network[56733]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:44:01 compute-1 network[56734]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:44:03 compute-1 sudo[56713]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:05 compute-1 sudo[57017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcalxjovqfxfeqlkfjqvkcvriwubxnch ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769161445.0262194-568-73942607964813/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769161445.0262194-568-73942607964813/args'
Jan 23 09:44:05 compute-1 sudo[57017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:05 compute-1 sudo[57017]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:05 compute-1 sudo[57184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhtrzdkysfqmjepvnewojcknqyhwvsrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161445.7072618-601-190880676998839/AnsiballZ_dnf.py'
Jan 23 09:44:05 compute-1 sudo[57184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:06 compute-1 python3.9[57186]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:44:07 compute-1 sudo[57184]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:10 compute-1 sudo[57337]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfqhbuaaghhlflpfegqtmhbtcmckuxkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161450.0883796-640-154530863454841/AnsiballZ_package_facts.py'
Jan 23 09:44:10 compute-1 sudo[57337]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:10 compute-1 python3.9[57339]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 09:44:11 compute-1 sudo[57337]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:12 compute-1 sudo[57489]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wztgwvzgzevmujzxjdwefxcmjwiaxlhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161451.899059-671-721447290078/AnsiballZ_stat.py'
Jan 23 09:44:12 compute-1 sudo[57489]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:12 compute-1 python3.9[57491]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:12 compute-1 sudo[57489]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:12 compute-1 sudo[57614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijczmahlkpvpoxhbdrzgpqoxztnicopx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161451.899059-671-721447290078/AnsiballZ_copy.py'
Jan 23 09:44:12 compute-1 sudo[57614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:13 compute-1 python3.9[57616]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161451.899059-671-721447290078/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:13 compute-1 sudo[57614]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:13 compute-1 sudo[57768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzrctdyxqlzzaftnzrqtxsqfgnxsbvhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161453.2719088-716-245324001402472/AnsiballZ_stat.py'
Jan 23 09:44:13 compute-1 sudo[57768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:13 compute-1 python3.9[57770]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:13 compute-1 sudo[57768]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:14 compute-1 sudo[57893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hafpqlabhthjqjltrcyvhgztpdsnpwxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161453.2719088-716-245324001402472/AnsiballZ_copy.py'
Jan 23 09:44:14 compute-1 sudo[57893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:14 compute-1 python3.9[57895]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161453.2719088-716-245324001402472/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:14 compute-1 sudo[57893]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:15 compute-1 sudo[58047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urmofvgfxupbcebuwjixfidhuaztzyir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161455.5580106-780-41389515859666/AnsiballZ_lineinfile.py'
Jan 23 09:44:15 compute-1 sudo[58047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:16 compute-1 python3.9[58049]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:16 compute-1 sudo[58047]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:17 compute-1 sudo[58201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlmbictkkqqfetciwtwiigvsknvhizxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161457.4672873-823-211093408907650/AnsiballZ_setup.py'
Jan 23 09:44:17 compute-1 sudo[58201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:18 compute-1 python3.9[58203]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:44:18 compute-1 sudo[58201]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:18 compute-1 sudo[58285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ixloxaojqtgpiwwaukysalycacklxecd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161457.4672873-823-211093408907650/AnsiballZ_systemd.py'
Jan 23 09:44:18 compute-1 sudo[58285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:19 compute-1 python3.9[58287]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:44:19 compute-1 sudo[58285]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:20 compute-1 sudo[58439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekgckrjfcyhylkjecbgzazlydufeibbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161460.2553308-872-267271900779708/AnsiballZ_setup.py'
Jan 23 09:44:20 compute-1 sudo[58439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:20 compute-1 python3.9[58441]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:44:21 compute-1 sudo[58439]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:21 compute-1 sudo[58523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiixpnjgisaosyljiizyurhbvahvufsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161460.2553308-872-267271900779708/AnsiballZ_systemd.py'
Jan 23 09:44:21 compute-1 sudo[58523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:21 compute-1 python3.9[58525]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:44:21 compute-1 systemd[1]: Stopping NTP client/server...
Jan 23 09:44:21 compute-1 chronyd[781]: chronyd exiting
Jan 23 09:44:21 compute-1 systemd[1]: chronyd.service: Deactivated successfully.
Jan 23 09:44:21 compute-1 systemd[1]: Stopped NTP client/server.
Jan 23 09:44:21 compute-1 systemd[1]: Starting NTP client/server...
Jan 23 09:44:21 compute-1 chronyd[58533]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Jan 23 09:44:21 compute-1 chronyd[58533]: Frequency -31.645 +/- 0.148 ppm read from /var/lib/chrony/drift
Jan 23 09:44:21 compute-1 chronyd[58533]: Loaded seccomp filter (level 2)
Jan 23 09:44:21 compute-1 systemd[1]: Started NTP client/server.
Jan 23 09:44:21 compute-1 sudo[58523]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:22 compute-1 sshd-session[53584]: Connection closed by 192.168.122.30 port 58774
Jan 23 09:44:22 compute-1 sshd-session[53581]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:44:22 compute-1 systemd[1]: session-13.scope: Deactivated successfully.
Jan 23 09:44:22 compute-1 systemd[1]: session-13.scope: Consumed 26.864s CPU time.
Jan 23 09:44:22 compute-1 systemd-logind[807]: Session 13 logged out. Waiting for processes to exit.
Jan 23 09:44:22 compute-1 systemd-logind[807]: Removed session 13.
Jan 23 09:44:29 compute-1 sshd-session[58559]: Accepted publickey for zuul from 192.168.122.30 port 49822 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:44:29 compute-1 systemd-logind[807]: New session 14 of user zuul.
Jan 23 09:44:29 compute-1 systemd[1]: Started Session 14 of User zuul.
Jan 23 09:44:29 compute-1 sshd-session[58559]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:44:29 compute-1 sudo[58712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ccnyktnaczidczmgdrkeyonhvcbyfkwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161469.308943-22-60809925733285/AnsiballZ_file.py'
Jan 23 09:44:29 compute-1 sudo[58712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:29 compute-1 python3.9[58714]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:30 compute-1 sudo[58712]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:30 compute-1 sudo[58864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esdsgsxfjsddfmlowffvqtlvvmexjeak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161470.173351-58-281226781184061/AnsiballZ_stat.py'
Jan 23 09:44:30 compute-1 sudo[58864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:30 compute-1 python3.9[58866]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:30 compute-1 sudo[58864]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:31 compute-1 sudo[58987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rezrcovflaltselyjamemlfwarxyxxbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161470.173351-58-281226781184061/AnsiballZ_copy.py'
Jan 23 09:44:31 compute-1 sudo[58987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:31 compute-1 python3.9[58989]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/ceph-networks.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161470.173351-58-281226781184061/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=729ea8396013e3343245d6e934e0dcef55029ad2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:31 compute-1 sudo[58987]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:31 compute-1 sshd-session[58562]: Connection closed by 192.168.122.30 port 49822
Jan 23 09:44:31 compute-1 sshd-session[58559]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:44:31 compute-1 systemd[1]: session-14.scope: Deactivated successfully.
Jan 23 09:44:31 compute-1 systemd[1]: session-14.scope: Consumed 1.769s CPU time.
Jan 23 09:44:31 compute-1 systemd-logind[807]: Session 14 logged out. Waiting for processes to exit.
Jan 23 09:44:31 compute-1 systemd-logind[807]: Removed session 14.
Jan 23 09:44:37 compute-1 sshd-session[59015]: Accepted publickey for zuul from 192.168.122.30 port 45254 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:44:37 compute-1 systemd-logind[807]: New session 15 of user zuul.
Jan 23 09:44:37 compute-1 systemd[1]: Started Session 15 of User zuul.
Jan 23 09:44:37 compute-1 sshd-session[59015]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:44:38 compute-1 python3.9[59168]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:44:39 compute-1 sudo[59322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zivacnpksdqopjhlyrcdowabldfmqkgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161479.3119578-55-96381233706803/AnsiballZ_file.py'
Jan 23 09:44:39 compute-1 sudo[59322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:39 compute-1 python3.9[59324]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:40 compute-1 sudo[59322]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:40 compute-1 sudo[59497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwfxdzqngkkmgrhcfxwcczglbzzofmxj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161480.1817193-79-192264425773339/AnsiballZ_stat.py'
Jan 23 09:44:40 compute-1 sudo[59497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:40 compute-1 python3.9[59499]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:40 compute-1 sudo[59497]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:41 compute-1 sudo[59620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jdseeadarohuahsaoectufaarnajwuhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161480.1817193-79-192264425773339/AnsiballZ_copy.py'
Jan 23 09:44:41 compute-1 sudo[59620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:41 compute-1 python3.9[59622]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769161480.1817193-79-192264425773339/.source.json _original_basename=.hjzk4ntc follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:41 compute-1 sudo[59620]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:42 compute-1 sudo[59772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lixscmfifcwpsnhvjrccrovbclltgzqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161482.0245578-148-162811431184813/AnsiballZ_stat.py'
Jan 23 09:44:42 compute-1 sudo[59772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:42 compute-1 python3.9[59774]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:42 compute-1 sudo[59772]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:42 compute-1 sudo[59895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhykchdrhbpfmicwhmpaaobfmmsjcbsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161482.0245578-148-162811431184813/AnsiballZ_copy.py'
Jan 23 09:44:42 compute-1 sudo[59895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:43 compute-1 python3.9[59897]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161482.0245578-148-162811431184813/.source _original_basename=.xnr2vyk_ follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:43 compute-1 sudo[59895]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:43 compute-1 sudo[60047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dinkmgmiezwveaazlhlfknpmlflfaict ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161483.401535-196-178303330846559/AnsiballZ_file.py'
Jan 23 09:44:43 compute-1 sudo[60047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:43 compute-1 python3.9[60049]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:44:43 compute-1 sudo[60047]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:44 compute-1 sudo[60199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iuhqgrqpbzgrxhdqctylswznkefjdnfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161484.1323788-220-46440115688947/AnsiballZ_stat.py'
Jan 23 09:44:44 compute-1 sudo[60199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:44 compute-1 python3.9[60201]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:44 compute-1 sudo[60199]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:44 compute-1 sudo[60322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjmypmcdhpbwzeewsoswxmqjiywwtflm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161484.1323788-220-46440115688947/AnsiballZ_copy.py'
Jan 23 09:44:44 compute-1 sudo[60322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:45 compute-1 python3.9[60324]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161484.1323788-220-46440115688947/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:44:45 compute-1 sudo[60322]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:45 compute-1 sudo[60474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrcxgclgeqxtikzimtmfdguegzlopzuo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161485.273861-220-17542410538509/AnsiballZ_stat.py'
Jan 23 09:44:45 compute-1 sudo[60474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:45 compute-1 python3.9[60476]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:45 compute-1 sudo[60474]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:46 compute-1 sudo[60597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlzkyfovlxpsnmhjbyfrzcpcxisjfdzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161485.273861-220-17542410538509/AnsiballZ_copy.py'
Jan 23 09:44:46 compute-1 sudo[60597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:46 compute-1 python3.9[60599]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769161485.273861-220-17542410538509/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:44:46 compute-1 sudo[60597]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:47 compute-1 sudo[60749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvqynapwgmlyxywdrowaysaorucamqqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161487.0895486-307-22243575471957/AnsiballZ_file.py'
Jan 23 09:44:47 compute-1 sudo[60749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:47 compute-1 python3.9[60751]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:47 compute-1 sudo[60749]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:48 compute-1 sudo[60901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wewthxyfmvxytffsgilrjhsrdpnnbfnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161487.7552755-331-49845857292096/AnsiballZ_stat.py'
Jan 23 09:44:48 compute-1 sudo[60901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:48 compute-1 python3.9[60903]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:48 compute-1 sudo[60901]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:48 compute-1 sudo[61024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qnmyimsaowfvbmiyoihyqlcozeuxguvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161487.7552755-331-49845857292096/AnsiballZ_copy.py'
Jan 23 09:44:48 compute-1 sudo[61024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:48 compute-1 python3.9[61026]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161487.7552755-331-49845857292096/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:48 compute-1 sudo[61024]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:49 compute-1 sudo[61177]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihwucapjjiqgwdulsramjpyzxswezmcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161488.9764304-376-141887458440941/AnsiballZ_stat.py'
Jan 23 09:44:49 compute-1 sudo[61177]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:49 compute-1 python3.9[61179]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:49 compute-1 sudo[61177]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:49 compute-1 sudo[61300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpeeogbaedujsbwbfnelelwollzerjbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161488.9764304-376-141887458440941/AnsiballZ_copy.py'
Jan 23 09:44:49 compute-1 sudo[61300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:49 compute-1 python3.9[61302]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161488.9764304-376-141887458440941/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:50 compute-1 sudo[61300]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:50 compute-1 sudo[61452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpspxmeltgjwzeatljzvwsbixbqjvzvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161490.1821928-421-61149674881645/AnsiballZ_systemd.py'
Jan 23 09:44:50 compute-1 sudo[61452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:51 compute-1 python3.9[61454]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:44:51 compute-1 systemd[1]: Reloading.
Jan 23 09:44:51 compute-1 systemd-rc-local-generator[61481]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:51 compute-1 systemd-sysv-generator[61485]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:51 compute-1 systemd[1]: Reloading.
Jan 23 09:44:51 compute-1 systemd-rc-local-generator[61522]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:51 compute-1 systemd-sysv-generator[61526]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:51 compute-1 systemd[1]: Starting EDPM Container Shutdown...
Jan 23 09:44:51 compute-1 systemd[1]: Finished EDPM Container Shutdown.
Jan 23 09:44:51 compute-1 sudo[61452]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:52 compute-1 sudo[61681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tgdcijvhyplglnogdtxsxacyuzsudtic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161491.9637334-445-15161176188205/AnsiballZ_stat.py'
Jan 23 09:44:52 compute-1 sudo[61681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:52 compute-1 python3.9[61683]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:52 compute-1 sudo[61681]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:52 compute-1 sudo[61804]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofczkxzjyuqxkqbsdkhsohothzwphzzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161491.9637334-445-15161176188205/AnsiballZ_copy.py'
Jan 23 09:44:52 compute-1 sudo[61804]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:53 compute-1 python3.9[61806]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161491.9637334-445-15161176188205/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:53 compute-1 sudo[61804]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:53 compute-1 sudo[61956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ienabsquizujkppcobpaxslpmercaehq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161493.2674856-490-135175618482859/AnsiballZ_stat.py'
Jan 23 09:44:53 compute-1 sudo[61956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:53 compute-1 python3.9[61958]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:44:53 compute-1 sudo[61956]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:54 compute-1 sudo[62079]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-almeetspykxzufujypbnrftgqozcmcfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161493.2674856-490-135175618482859/AnsiballZ_copy.py'
Jan 23 09:44:54 compute-1 sudo[62079]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:54 compute-1 python3.9[62081]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161493.2674856-490-135175618482859/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:44:54 compute-1 sudo[62079]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:55 compute-1 sudo[62231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gswiavixomvuhnznzvnpqcqejzcsfsxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161494.764386-535-212063569528828/AnsiballZ_systemd.py'
Jan 23 09:44:55 compute-1 sudo[62231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:44:55 compute-1 python3.9[62233]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:44:55 compute-1 systemd[1]: Reloading.
Jan 23 09:44:55 compute-1 systemd-rc-local-generator[62261]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:55 compute-1 systemd-sysv-generator[62264]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:55 compute-1 systemd[1]: Reloading.
Jan 23 09:44:55 compute-1 systemd-rc-local-generator[62298]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:44:55 compute-1 systemd-sysv-generator[62302]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:44:55 compute-1 systemd[1]: Starting Create netns directory...
Jan 23 09:44:55 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 09:44:55 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 09:44:55 compute-1 systemd[1]: Finished Create netns directory.
Jan 23 09:44:55 compute-1 sudo[62231]: pam_unix(sudo:session): session closed for user root
Jan 23 09:44:56 compute-1 python3.9[62458]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:44:56 compute-1 network[62475]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:44:56 compute-1 network[62476]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:44:56 compute-1 network[62477]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:45:01 compute-1 anacron[2198]: Job `cron.daily' started
Jan 23 09:45:01 compute-1 sudo[62737]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjdifgalbzaxlenwlspcrvvltwoipsoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161500.9036043-583-229173958394806/AnsiballZ_systemd.py'
Jan 23 09:45:01 compute-1 sudo[62737]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:01 compute-1 anacron[2198]: Job `cron.daily' terminated
Jan 23 09:45:01 compute-1 python3.9[62740]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:45:01 compute-1 systemd[1]: Reloading.
Jan 23 09:45:01 compute-1 systemd-rc-local-generator[62773]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:45:01 compute-1 systemd-sysv-generator[62777]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:45:01 compute-1 systemd[1]: Stopping IPv4 firewall with iptables...
Jan 23 09:45:02 compute-1 iptables.init[62783]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Jan 23 09:45:02 compute-1 iptables.init[62783]: iptables: Flushing firewall rules: [  OK  ]
Jan 23 09:45:02 compute-1 systemd[1]: iptables.service: Deactivated successfully.
Jan 23 09:45:02 compute-1 systemd[1]: Stopped IPv4 firewall with iptables.
Jan 23 09:45:02 compute-1 sudo[62737]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:02 compute-1 sudo[62978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fvkjoblrohggqgjkhgbthvwkvbipursd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161502.3081222-583-238244041814725/AnsiballZ_systemd.py'
Jan 23 09:45:02 compute-1 sudo[62978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:02 compute-1 python3.9[62980]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:45:02 compute-1 sudo[62978]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:03 compute-1 sudo[63132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmeppnpcynleqloxutcvpytcjuusnanw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161503.314022-631-268136647105440/AnsiballZ_systemd.py'
Jan 23 09:45:03 compute-1 sudo[63132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:03 compute-1 python3.9[63134]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:45:04 compute-1 systemd[1]: Reloading.
Jan 23 09:45:04 compute-1 systemd-rc-local-generator[63158]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:45:04 compute-1 systemd-sysv-generator[63162]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:45:04 compute-1 systemd[1]: Starting Netfilter Tables...
Jan 23 09:45:04 compute-1 systemd[1]: Finished Netfilter Tables.
Jan 23 09:45:04 compute-1 sudo[63132]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:05 compute-1 sudo[63324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hobvchitdxgyfdcmcuzsjvishzvyzqwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161504.575341-655-144145480081402/AnsiballZ_command.py'
Jan 23 09:45:05 compute-1 sudo[63324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:05 compute-1 python3.9[63326]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:05 compute-1 sudo[63324]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:06 compute-1 sudo[63477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwmsrbxphyaazoedwjlgwrqbddpnomwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161506.1696353-697-261973249454088/AnsiballZ_stat.py'
Jan 23 09:45:06 compute-1 sudo[63477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:06 compute-1 python3.9[63479]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:06 compute-1 sudo[63477]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:07 compute-1 sudo[63602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzrlvjkklqllrqwjujjygziazjcielqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161506.1696353-697-261973249454088/AnsiballZ_copy.py'
Jan 23 09:45:07 compute-1 sudo[63602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:07 compute-1 python3.9[63604]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161506.1696353-697-261973249454088/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:07 compute-1 sudo[63602]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:07 compute-1 sudo[63755]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mumobdjycrbqpiazeblscrugocrpdsot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161507.4500515-742-272147236254793/AnsiballZ_systemd.py'
Jan 23 09:45:07 compute-1 sudo[63755]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:07 compute-1 python3.9[63757]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:45:08 compute-1 systemd[1]: Reloading OpenSSH server daemon...
Jan 23 09:45:08 compute-1 sshd[1007]: Received SIGHUP; restarting.
Jan 23 09:45:08 compute-1 sshd[1007]: Server listening on 0.0.0.0 port 22.
Jan 23 09:45:08 compute-1 sshd[1007]: Server listening on :: port 22.
Jan 23 09:45:08 compute-1 systemd[1]: Reloaded OpenSSH server daemon.
Jan 23 09:45:08 compute-1 sudo[63755]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:08 compute-1 sudo[63911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icmmumgmzvszrlqylfnbjeaptlwbwqwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161508.3111563-766-69989599948882/AnsiballZ_file.py'
Jan 23 09:45:08 compute-1 sudo[63911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:08 compute-1 python3.9[63913]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:08 compute-1 sudo[63911]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:09 compute-1 sudo[64063]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qccytcolzmbdjjdiwrfljxochztwxnpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161509.0550687-790-90975495699921/AnsiballZ_stat.py'
Jan 23 09:45:09 compute-1 sudo[64063]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:09 compute-1 python3.9[64065]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:09 compute-1 sudo[64063]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:09 compute-1 sudo[64186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrtrvkuoouagmfewywqxkrsjzgttevcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161509.0550687-790-90975495699921/AnsiballZ_copy.py'
Jan 23 09:45:09 compute-1 sudo[64186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:10 compute-1 python3.9[64188]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161509.0550687-790-90975495699921/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:10 compute-1 sudo[64186]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:10 compute-1 sudo[64338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huedutahzlchhjbowjjcmswsafjkssqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161510.544356-844-23952132552369/AnsiballZ_timezone.py'
Jan 23 09:45:10 compute-1 sudo[64338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:11 compute-1 python3.9[64340]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 09:45:11 compute-1 systemd[1]: Starting Time & Date Service...
Jan 23 09:45:11 compute-1 systemd[1]: Started Time & Date Service.
Jan 23 09:45:11 compute-1 sudo[64338]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:11 compute-1 sudo[64494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhyfrtvwtlazkkimvlnektqhnalpviib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161511.55174-871-245050639708188/AnsiballZ_file.py'
Jan 23 09:45:11 compute-1 sudo[64494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:11 compute-1 python3.9[64496]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:12 compute-1 sudo[64494]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:13 compute-1 sudo[64646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-csdcjcaotwuwhcrwkongsrqcehonaxqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161513.1783187-895-187378640113573/AnsiballZ_stat.py'
Jan 23 09:45:13 compute-1 sudo[64646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:13 compute-1 python3.9[64648]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:13 compute-1 sudo[64646]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:14 compute-1 sudo[64769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kwlpvgfrudpfceqiotozccmggptcehbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161513.1783187-895-187378640113573/AnsiballZ_copy.py'
Jan 23 09:45:14 compute-1 sudo[64769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:14 compute-1 python3.9[64771]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161513.1783187-895-187378640113573/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:14 compute-1 sudo[64769]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:14 compute-1 sudo[64921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jazezvozqktdwkervkugufiqfjpeyghp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161514.3688216-940-124432934527201/AnsiballZ_stat.py'
Jan 23 09:45:14 compute-1 sudo[64921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:14 compute-1 python3.9[64923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:14 compute-1 sudo[64921]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:15 compute-1 sudo[65044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qeatgdhfchyxakhldekhfftmovabwpvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161514.3688216-940-124432934527201/AnsiballZ_copy.py'
Jan 23 09:45:15 compute-1 sudo[65044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:15 compute-1 python3.9[65046]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769161514.3688216-940-124432934527201/.source.yaml _original_basename=.g4g35s3q follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:15 compute-1 sudo[65044]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:15 compute-1 sudo[65196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fgvavatelgcpdxaaskguvzvyujoxqujf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161515.5931053-985-206815604767094/AnsiballZ_stat.py'
Jan 23 09:45:15 compute-1 sudo[65196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:16 compute-1 python3.9[65198]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:16 compute-1 sudo[65196]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:16 compute-1 sudo[65319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cbaoqgpqauyaotumiywgikjkwqrfhoce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161515.5931053-985-206815604767094/AnsiballZ_copy.py'
Jan 23 09:45:16 compute-1 sudo[65319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:16 compute-1 python3.9[65321]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161515.5931053-985-206815604767094/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:16 compute-1 sudo[65319]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:17 compute-1 sudo[65471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtepruxhmawcaursnnxhlrdjomocziik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161517.1223845-1030-41545478912163/AnsiballZ_command.py'
Jan 23 09:45:17 compute-1 sudo[65471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:17 compute-1 python3.9[65473]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:17 compute-1 sudo[65471]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:18 compute-1 sudo[65624]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urqigijwjnvgfxdostqidxmhwlljuczw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161517.9190257-1054-265191096999561/AnsiballZ_command.py'
Jan 23 09:45:18 compute-1 sudo[65624]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:18 compute-1 python3.9[65626]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:18 compute-1 sudo[65624]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:19 compute-1 sudo[65777]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uuoqjlxyczacztpefyhbcpljtfafpfqt ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769161518.6099234-1078-234529297555646/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 09:45:19 compute-1 sudo[65777]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:19 compute-1 python3[65779]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 09:45:19 compute-1 sudo[65777]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:19 compute-1 sudo[65929]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cihkvtzbytquttpazqbaadqdcyuykrfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161519.4919567-1102-115947755542535/AnsiballZ_stat.py'
Jan 23 09:45:19 compute-1 sudo[65929]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:19 compute-1 python3.9[65931]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:19 compute-1 sudo[65929]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:20 compute-1 sudo[66052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwkjwakdzywtlewkjjsiuherspcwzupz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161519.4919567-1102-115947755542535/AnsiballZ_copy.py'
Jan 23 09:45:20 compute-1 sudo[66052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:20 compute-1 python3.9[66054]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161519.4919567-1102-115947755542535/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:20 compute-1 sudo[66052]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:21 compute-1 sudo[66204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rweqvxxuxgwwjvvqamemuxxlocausopc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161520.7631946-1147-30328738945675/AnsiballZ_stat.py'
Jan 23 09:45:21 compute-1 sudo[66204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:21 compute-1 python3.9[66206]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:21 compute-1 sudo[66204]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:21 compute-1 sudo[66327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apbgmgphqvsfyfjbcqwclzuouzeelnmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161520.7631946-1147-30328738945675/AnsiballZ_copy.py'
Jan 23 09:45:21 compute-1 sudo[66327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:21 compute-1 python3.9[66329]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161520.7631946-1147-30328738945675/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:21 compute-1 sudo[66327]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:22 compute-1 sudo[66479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oiyzfntwppqvwedyaztayqbhfkpnazkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161522.0188193-1192-129481673749233/AnsiballZ_stat.py'
Jan 23 09:45:22 compute-1 sudo[66479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:22 compute-1 python3.9[66481]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:22 compute-1 sudo[66479]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:22 compute-1 sudo[66602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ytcyrhzmcvikcrxiddzwwapifcsloqog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161522.0188193-1192-129481673749233/AnsiballZ_copy.py'
Jan 23 09:45:22 compute-1 sudo[66602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:23 compute-1 python3.9[66604]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161522.0188193-1192-129481673749233/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:23 compute-1 sudo[66602]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:23 compute-1 sudo[66754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cdzbhnmbodolqkedapevvrzamxolvjbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161523.321505-1237-4778494083511/AnsiballZ_stat.py'
Jan 23 09:45:23 compute-1 sudo[66754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:23 compute-1 python3.9[66756]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:23 compute-1 sudo[66754]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:24 compute-1 sudo[66877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-curtiyuchrzwcfaeapiycknbbkxhnpzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161523.321505-1237-4778494083511/AnsiballZ_copy.py'
Jan 23 09:45:24 compute-1 sudo[66877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:24 compute-1 python3.9[66879]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161523.321505-1237-4778494083511/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:24 compute-1 sudo[66877]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:25 compute-1 sudo[67029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eiyodowxtvqyngyikzdskbsyyjaiiljw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161524.5952666-1282-75952755557257/AnsiballZ_stat.py'
Jan 23 09:45:25 compute-1 sudo[67029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:25 compute-1 python3.9[67031]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:45:25 compute-1 sudo[67029]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:25 compute-1 sudo[67152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxdadisggmqwzmojfsjkhuyttxqqocae ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161524.5952666-1282-75952755557257/AnsiballZ_copy.py'
Jan 23 09:45:25 compute-1 sudo[67152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:25 compute-1 python3.9[67154]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769161524.5952666-1282-75952755557257/.source.nft follow=False _original_basename=ruleset.j2 checksum=693377dc03e5b6b24713cb537b18b88774724e35 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:25 compute-1 sudo[67152]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:26 compute-1 sudo[67304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yerabxhynoanoovualitfbxjuqvscyvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161525.950579-1327-230905717159027/AnsiballZ_file.py'
Jan 23 09:45:26 compute-1 sudo[67304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:26 compute-1 python3.9[67306]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:26 compute-1 sudo[67304]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:26 compute-1 sudo[67456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pegncenfrxuahmaxlayxjvhobvjcmofe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161526.6712992-1352-146653770357131/AnsiballZ_command.py'
Jan 23 09:45:26 compute-1 sudo[67456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:27 compute-1 python3.9[67458]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:27 compute-1 sudo[67456]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:27 compute-1 sudo[67615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lycobpasmwcbruwobxglhmxenswcuzvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161527.468319-1375-207740655420521/AnsiballZ_blockinfile.py'
Jan 23 09:45:27 compute-1 sudo[67615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:28 compute-1 python3.9[67617]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                            include "/etc/nftables/edpm-chains.nft"
                                            include "/etc/nftables/edpm-rules.nft"
                                            include "/etc/nftables/edpm-jumps.nft"
                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:28 compute-1 sudo[67615]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:28 compute-1 sudo[67768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnbjtuaxzveziholshvgvpxohrisjhyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161528.4114597-1402-248540427611450/AnsiballZ_file.py'
Jan 23 09:45:28 compute-1 sudo[67768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:28 compute-1 python3.9[67770]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:28 compute-1 sudo[67768]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:29 compute-1 sudo[67920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbshtdelodprznsmnlobxsdsqouehmkz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161529.0610857-1402-141984188762414/AnsiballZ_file.py'
Jan 23 09:45:29 compute-1 sudo[67920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:29 compute-1 python3.9[67922]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:29 compute-1 sudo[67920]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:30 compute-1 sudo[68072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjajqxkmazjbmkuqjxolkxlccfejuxqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161529.9735086-1447-69233835153376/AnsiballZ_mount.py'
Jan 23 09:45:30 compute-1 sudo[68072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:30 compute-1 python3.9[68074]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:45:30 compute-1 sudo[68072]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:31 compute-1 sudo[68225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqlenyojizqqkqwdtgbexfmvamejupow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161530.9101796-1447-15872880383807/AnsiballZ_mount.py'
Jan 23 09:45:31 compute-1 sudo[68225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:31 compute-1 python3.9[68227]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:45:31 compute-1 sudo[68225]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:31 compute-1 sshd-session[59018]: Connection closed by 192.168.122.30 port 45254
Jan 23 09:45:31 compute-1 sshd-session[59015]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:45:31 compute-1 systemd[1]: session-15.scope: Deactivated successfully.
Jan 23 09:45:31 compute-1 systemd[1]: session-15.scope: Consumed 36.479s CPU time.
Jan 23 09:45:31 compute-1 systemd-logind[807]: Session 15 logged out. Waiting for processes to exit.
Jan 23 09:45:31 compute-1 systemd-logind[807]: Removed session 15.
Jan 23 09:45:37 compute-1 sshd-session[68253]: Accepted publickey for zuul from 192.168.122.30 port 60472 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:45:37 compute-1 systemd-logind[807]: New session 16 of user zuul.
Jan 23 09:45:37 compute-1 systemd[1]: Started Session 16 of User zuul.
Jan 23 09:45:37 compute-1 sshd-session[68253]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:45:38 compute-1 sudo[68406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyglmnswvpoygkrzabwtcvylahxqqprx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161537.6024764-19-252795746251259/AnsiballZ_tempfile.py'
Jan 23 09:45:38 compute-1 sudo[68406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:38 compute-1 python3.9[68408]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 09:45:38 compute-1 sudo[68406]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:38 compute-1 sudo[68558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyxldrprlenazaoshzokinszxllgbnyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161538.4174206-55-7810426439896/AnsiballZ_stat.py'
Jan 23 09:45:38 compute-1 sudo[68558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:38 compute-1 python3.9[68560]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:45:38 compute-1 sudo[68558]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:39 compute-1 sudo[68710]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncfjnxogwdbjaypoejsxaarmgnsjesjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161539.3760016-85-222211332112790/AnsiballZ_setup.py'
Jan 23 09:45:39 compute-1 sudo[68710]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:40 compute-1 python3.9[68712]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:45:40 compute-1 sudo[68710]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:41 compute-1 sudo[68862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-senaefurbrwclgylgjxldusdvfdewfro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161540.59557-110-101659942795306/AnsiballZ_blockinfile.py'
Jan 23 09:45:41 compute-1 sudo[68862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:41 compute-1 python3.9[68864]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG
                                            compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF
                                            compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o
                                            compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=
                                             create=True mode=0644 path=/tmp/ansible.ncrgoy7m state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:41 compute-1 sudo[68862]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:41 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 09:45:41 compute-1 sudo[69016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qrfgqyqrqzrtdktkzofguowhmcgdwiua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161541.4342046-134-242369181841210/AnsiballZ_command.py'
Jan 23 09:45:41 compute-1 sudo[69016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:42 compute-1 python3.9[69018]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.ncrgoy7m' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:42 compute-1 sudo[69016]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:42 compute-1 sudo[69170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kocpcglhnsjarxwbcbxivvoxgbmtcbli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161542.2790592-158-94831262626807/AnsiballZ_file.py'
Jan 23 09:45:42 compute-1 sudo[69170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:42 compute-1 python3.9[69172]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.ncrgoy7m state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:42 compute-1 sudo[69170]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:43 compute-1 sshd-session[68256]: Connection closed by 192.168.122.30 port 60472
Jan 23 09:45:43 compute-1 sshd-session[68253]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:45:43 compute-1 systemd[1]: session-16.scope: Deactivated successfully.
Jan 23 09:45:43 compute-1 systemd[1]: session-16.scope: Consumed 3.392s CPU time.
Jan 23 09:45:43 compute-1 systemd-logind[807]: Session 16 logged out. Waiting for processes to exit.
Jan 23 09:45:43 compute-1 systemd-logind[807]: Removed session 16.
Jan 23 09:45:49 compute-1 sshd-session[69197]: Accepted publickey for zuul from 192.168.122.30 port 44652 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:45:49 compute-1 systemd-logind[807]: New session 17 of user zuul.
Jan 23 09:45:49 compute-1 systemd[1]: Started Session 17 of User zuul.
Jan 23 09:45:49 compute-1 sshd-session[69197]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:45:50 compute-1 python3.9[69350]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:45:51 compute-1 sudo[69504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxyfyysgzqzpowtktssinwnvgztfhrbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161551.0525572-52-6987810876520/AnsiballZ_systemd.py'
Jan 23 09:45:51 compute-1 sudo[69504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:52 compute-1 python3.9[69506]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 09:45:52 compute-1 sudo[69504]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:52 compute-1 sudo[69658]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uiotzaiewgtrmjavzmchsxmmpanqgqsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161552.3926055-76-213245573362442/AnsiballZ_systemd.py'
Jan 23 09:45:52 compute-1 sudo[69658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:52 compute-1 python3.9[69660]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:45:53 compute-1 sudo[69658]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:53 compute-1 sudo[69811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfakeficnrdmrnnxfjyoyvcwncheszpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161553.2835784-103-25887936725218/AnsiballZ_command.py'
Jan 23 09:45:53 compute-1 sudo[69811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:53 compute-1 python3.9[69813]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:53 compute-1 sudo[69811]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:54 compute-1 sudo[69964]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjueixqhvnfvbvvlfwvaudveygzoeosg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161554.2587197-127-230018967782649/AnsiballZ_stat.py'
Jan 23 09:45:54 compute-1 sudo[69964]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:54 compute-1 python3.9[69966]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:45:54 compute-1 sudo[69964]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:55 compute-1 sudo[70118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sanegttxjlkiyzdznmcnrnmitsyrarqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161555.0582535-151-195947930863447/AnsiballZ_command.py'
Jan 23 09:45:55 compute-1 sudo[70118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:55 compute-1 python3.9[70120]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:45:55 compute-1 sudo[70118]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:56 compute-1 sudo[70273]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uqbvmzoqllgcoevyydipgdjymyoewcgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161555.845806-175-200109400742757/AnsiballZ_file.py'
Jan 23 09:45:56 compute-1 sudo[70273]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:45:56 compute-1 python3.9[70275]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:45:56 compute-1 sudo[70273]: pam_unix(sudo:session): session closed for user root
Jan 23 09:45:57 compute-1 sshd-session[69200]: Connection closed by 192.168.122.30 port 44652
Jan 23 09:45:57 compute-1 sshd-session[69197]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:45:57 compute-1 systemd[1]: session-17.scope: Deactivated successfully.
Jan 23 09:45:57 compute-1 systemd[1]: session-17.scope: Consumed 4.521s CPU time.
Jan 23 09:45:57 compute-1 systemd-logind[807]: Session 17 logged out. Waiting for processes to exit.
Jan 23 09:45:57 compute-1 systemd-logind[807]: Removed session 17.
Jan 23 09:46:02 compute-1 sshd-session[70300]: Accepted publickey for zuul from 192.168.122.30 port 41980 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:46:02 compute-1 systemd-logind[807]: New session 18 of user zuul.
Jan 23 09:46:02 compute-1 systemd[1]: Started Session 18 of User zuul.
Jan 23 09:46:02 compute-1 sshd-session[70300]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:46:03 compute-1 python3.9[70453]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:46:04 compute-1 sudo[70607]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbtwiqxrtoobrmfkopfmkofwizdqtwfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161564.1264234-58-102204871801922/AnsiballZ_setup.py'
Jan 23 09:46:04 compute-1 sudo[70607]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:04 compute-1 python3.9[70609]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:46:05 compute-1 sudo[70607]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:05 compute-1 sudo[70691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nezdrrclzekxvuacknikapoiuoytlmed ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769161564.1264234-58-102204871801922/AnsiballZ_dnf.py'
Jan 23 09:46:05 compute-1 sudo[70691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:05 compute-1 python3.9[70693]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:46:07 compute-1 sudo[70691]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:08 compute-1 python3.9[70844]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:46:10 compute-1 python3.9[70995]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:46:11 compute-1 python3.9[71145]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:46:11 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:46:11 compute-1 python3.9[71297]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:46:12 compute-1 sshd-session[71291]: Invalid user solana from 45.148.10.240 port 60180
Jan 23 09:46:12 compute-1 sshd-session[70303]: Connection closed by 192.168.122.30 port 41980
Jan 23 09:46:12 compute-1 sshd-session[70300]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:46:12 compute-1 systemd[1]: session-18.scope: Deactivated successfully.
Jan 23 09:46:12 compute-1 systemd[1]: session-18.scope: Consumed 6.367s CPU time.
Jan 23 09:46:12 compute-1 systemd-logind[807]: Session 18 logged out. Waiting for processes to exit.
Jan 23 09:46:12 compute-1 systemd-logind[807]: Removed session 18.
Jan 23 09:46:12 compute-1 sshd-session[71291]: Connection closed by invalid user solana 45.148.10.240 port 60180 [preauth]
Jan 23 09:46:20 compute-1 sshd-session[71323]: Accepted publickey for zuul from 38.129.56.17 port 49808 ssh2: RSA SHA256:/TrmfiPCpRhp7iDH6L+XY56Icv2RRStSYrCVh8OnXTQ
Jan 23 09:46:20 compute-1 systemd-logind[807]: New session 19 of user zuul.
Jan 23 09:46:20 compute-1 systemd[1]: Started Session 19 of User zuul.
Jan 23 09:46:20 compute-1 sshd-session[71323]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:46:21 compute-1 sudo[71399]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jnryeizfcphgxobplgoqkrmxuhbavsqi ; /usr/bin/python3'
Jan 23 09:46:21 compute-1 sudo[71399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:21 compute-1 useradd[71403]: new group: name=ceph-admin, GID=42478
Jan 23 09:46:21 compute-1 useradd[71403]: new user: name=ceph-admin, UID=42477, GID=42478, home=/home/ceph-admin, shell=/bin/bash, from=none
Jan 23 09:46:21 compute-1 sudo[71399]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:21 compute-1 sudo[71485]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pvvdskspykgrgbzvwfeffrqsdaktdzgr ; /usr/bin/python3'
Jan 23 09:46:21 compute-1 sudo[71485]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:21 compute-1 sudo[71485]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:22 compute-1 sudo[71558]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fjgolfmyveucodpnybpsbjzqqsiwgjor ; /usr/bin/python3'
Jan 23 09:46:22 compute-1 sudo[71558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:22 compute-1 sudo[71558]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:22 compute-1 sudo[71608]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-imtasjjuenvhjtzakiomrgxmcpwfvmck ; /usr/bin/python3'
Jan 23 09:46:22 compute-1 sudo[71608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:23 compute-1 sudo[71608]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:23 compute-1 sudo[71634]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hbxglgzmolbbnixiajbttzheniwhxcgs ; /usr/bin/python3'
Jan 23 09:46:23 compute-1 sudo[71634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:23 compute-1 sudo[71634]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:23 compute-1 sudo[71660]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-czsmqwtmwnbrzofhbuyqrtukhuhbqmne ; /usr/bin/python3'
Jan 23 09:46:23 compute-1 sudo[71660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:23 compute-1 sudo[71660]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:24 compute-1 sudo[71686]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utksgvgaxwkobnyeclrwipkfshwojvzz ; /usr/bin/python3'
Jan 23 09:46:24 compute-1 sudo[71686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:24 compute-1 sudo[71686]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:24 compute-1 sudo[71764]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-idelsvraxtdwhniunprsrppnxzmgqifu ; /usr/bin/python3'
Jan 23 09:46:24 compute-1 sudo[71764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:24 compute-1 sudo[71764]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:25 compute-1 sudo[71837]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fkslnwsiceykbrvkxsevuivsqlejvylm ; /usr/bin/python3'
Jan 23 09:46:25 compute-1 sudo[71837]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:25 compute-1 sudo[71837]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:25 compute-1 sudo[71939]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afuexprvdqamddtmwjkqzkrjalzjxehc ; /usr/bin/python3'
Jan 23 09:46:25 compute-1 sudo[71939]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:25 compute-1 sudo[71939]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:25 compute-1 sudo[72012]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iocyghewsyfevkwfhspuhtnuqploupez ; /usr/bin/python3'
Jan 23 09:46:25 compute-1 sudo[72012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:26 compute-1 sudo[72012]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:26 compute-1 sudo[72062]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vtisyzxqfkukevtfmwxewtvkavzgewbk ; /usr/bin/python3'
Jan 23 09:46:26 compute-1 sudo[72062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:26 compute-1 python3[72064]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:46:27 compute-1 sudo[72062]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:28 compute-1 sudo[72157]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqctffoqczmlshujqhgbbtbtxkiplobw ; /usr/bin/python3'
Jan 23 09:46:28 compute-1 sudo[72157]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:28 compute-1 python3[72159]: ansible-ansible.legacy.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Jan 23 09:46:29 compute-1 sudo[72157]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:30 compute-1 sudo[72184]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxaywutazbioiebklnqbrulddctnziuo ; /usr/bin/python3'
Jan 23 09:46:30 compute-1 sudo[72184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:30 compute-1 python3[72186]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 09:46:30 compute-1 sudo[72184]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:30 compute-1 sudo[72210]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odymnafqmzkaxzknlehryyhcebnymuad ; /usr/bin/python3'
Jan 23 09:46:30 compute-1 sudo[72210]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:30 compute-1 python3[72212]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=20G
                                          losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                          lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:46:30 compute-1 kernel: loop: module loaded
Jan 23 09:46:30 compute-1 kernel: loop3: detected capacity change from 0 to 41943040
Jan 23 09:46:30 compute-1 sudo[72210]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:31 compute-1 sudo[72246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tmekmympqhgttybhyvdrwaehxyhspkju ; /usr/bin/python3'
Jan 23 09:46:31 compute-1 sudo[72246]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:31 compute-1 python3[72248]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                          vgcreate ceph_vg0 /dev/loop3
                                          lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                          lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:46:31 compute-1 lvm[72251]: PV /dev/loop3 not used.
Jan 23 09:46:31 compute-1 lvm[72260]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:46:31 compute-1 sudo[72246]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:31 compute-1 systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Jan 23 09:46:31 compute-1 lvm[72262]:   1 logical volume(s) in volume group "ceph_vg0" now active
Jan 23 09:46:31 compute-1 systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Jan 23 09:46:32 compute-1 sudo[72338]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnvpouysqmbaeevukgzkjunrgbwrmwac ; /usr/bin/python3'
Jan 23 09:46:32 compute-1 sudo[72338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:32 compute-1 python3[72340]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Jan 23 09:46:32 compute-1 sudo[72338]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:32 compute-1 chronyd[58533]: Selected source 54.39.23.64 (pool.ntp.org)
Jan 23 09:46:32 compute-1 sudo[72411]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxiqcqxhxhxtmomaincbkrcywimjwmzo ; /usr/bin/python3'
Jan 23 09:46:32 compute-1 sudo[72411]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:32 compute-1 python3[72413]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769161591.8796751-37004-63740275737266/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:46:32 compute-1 sudo[72411]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:33 compute-1 sudo[72461]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kxjracfjmabqtcemxbkecqtilxkmfhhj ; /usr/bin/python3'
Jan 23 09:46:33 compute-1 sudo[72461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:46:33 compute-1 python3[72463]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:46:33 compute-1 systemd[1]: Reloading.
Jan 23 09:46:33 compute-1 systemd-rc-local-generator[72488]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:46:33 compute-1 systemd-sysv-generator[72494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:46:33 compute-1 systemd[1]: Starting Ceph OSD losetup...
Jan 23 09:46:33 compute-1 bash[72504]: /dev/loop3: [64513]:4328449 (/var/lib/ceph-osd-0.img)
Jan 23 09:46:33 compute-1 systemd[1]: Finished Ceph OSD losetup.
Jan 23 09:46:33 compute-1 sudo[72461]: pam_unix(sudo:session): session closed for user root
Jan 23 09:46:33 compute-1 lvm[72505]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:46:33 compute-1 lvm[72505]: VG ceph_vg0 finished
Jan 23 09:46:36 compute-1 python3[72529]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:48:26 compute-1 sshd-session[72573]: Invalid user sol from 45.148.10.240 port 46266
Jan 23 09:48:26 compute-1 sshd-session[72573]: Connection closed by invalid user sol 45.148.10.240 port 46266 [preauth]
Jan 23 09:48:43 compute-1 sshd-session[72575]: Accepted publickey for ceph-admin from 192.168.122.100 port 58996 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:43 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 09:48:43 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 09:48:43 compute-1 systemd-logind[807]: New session 20 of user ceph-admin.
Jan 23 09:48:43 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 09:48:43 compute-1 systemd[1]: Starting User Manager for UID 42477...
Jan 23 09:48:43 compute-1 systemd[72579]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:43 compute-1 systemd[72579]: Queued start job for default target Main User Target.
Jan 23 09:48:43 compute-1 systemd[72579]: Created slice User Application Slice.
Jan 23 09:48:43 compute-1 systemd[72579]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:48:43 compute-1 systemd[72579]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:48:43 compute-1 systemd[72579]: Reached target Paths.
Jan 23 09:48:43 compute-1 systemd[72579]: Reached target Timers.
Jan 23 09:48:43 compute-1 systemd[72579]: Starting D-Bus User Message Bus Socket...
Jan 23 09:48:43 compute-1 systemd[72579]: Starting Create User's Volatile Files and Directories...
Jan 23 09:48:43 compute-1 systemd[72579]: Finished Create User's Volatile Files and Directories.
Jan 23 09:48:43 compute-1 systemd[72579]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:48:43 compute-1 systemd[72579]: Reached target Sockets.
Jan 23 09:48:43 compute-1 systemd[72579]: Reached target Basic System.
Jan 23 09:48:43 compute-1 systemd[1]: Started User Manager for UID 42477.
Jan 23 09:48:43 compute-1 systemd[72579]: Reached target Main User Target.
Jan 23 09:48:43 compute-1 systemd[72579]: Startup finished in 121ms.
Jan 23 09:48:44 compute-1 systemd[1]: Started Session 20 of User ceph-admin.
Jan 23 09:48:44 compute-1 sshd-session[72575]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:44 compute-1 sshd-session[72592]: Accepted publickey for ceph-admin from 192.168.122.100 port 59006 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:44 compute-1 systemd-logind[807]: New session 22 of user ceph-admin.
Jan 23 09:48:44 compute-1 systemd[1]: Started Session 22 of User ceph-admin.
Jan 23 09:48:44 compute-1 sshd-session[72592]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:44 compute-1 sudo[72600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:48:44 compute-1 sudo[72600]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:44 compute-1 sudo[72600]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:44 compute-1 sshd-session[72625]: Accepted publickey for ceph-admin from 192.168.122.100 port 59010 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:44 compute-1 systemd-logind[807]: New session 23 of user ceph-admin.
Jan 23 09:48:44 compute-1 systemd[1]: Started Session 23 of User ceph-admin.
Jan 23 09:48:44 compute-1 sshd-session[72625]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:44 compute-1 sudo[72629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Jan 23 09:48:44 compute-1 sudo[72629]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:44 compute-1 sudo[72629]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:44 compute-1 sshd-session[72654]: Accepted publickey for ceph-admin from 192.168.122.100 port 59012 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:44 compute-1 systemd-logind[807]: New session 24 of user ceph-admin.
Jan 23 09:48:44 compute-1 systemd[1]: Started Session 24 of User ceph-admin.
Jan 23 09:48:44 compute-1 sshd-session[72654]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:44 compute-1 sudo[72658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Jan 23 09:48:44 compute-1 sudo[72658]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:44 compute-1 sudo[72658]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:45 compute-1 sshd-session[72683]: Accepted publickey for ceph-admin from 192.168.122.100 port 59028 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:45 compute-1 systemd-logind[807]: New session 25 of user ceph-admin.
Jan 23 09:48:45 compute-1 systemd[1]: Started Session 25 of User ceph-admin.
Jan 23 09:48:45 compute-1 sshd-session[72683]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:45 compute-1 sudo[72687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:48:45 compute-1 sudo[72687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:45 compute-1 sudo[72687]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:45 compute-1 sshd-session[72712]: Accepted publickey for ceph-admin from 192.168.122.100 port 59038 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:45 compute-1 systemd-logind[807]: New session 26 of user ceph-admin.
Jan 23 09:48:45 compute-1 systemd[1]: Started Session 26 of User ceph-admin.
Jan 23 09:48:45 compute-1 sshd-session[72712]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:45 compute-1 sudo[72716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:48:45 compute-1 sudo[72716]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:45 compute-1 sudo[72716]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:45 compute-1 sshd-session[72741]: Accepted publickey for ceph-admin from 192.168.122.100 port 59042 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:45 compute-1 systemd-logind[807]: New session 27 of user ceph-admin.
Jan 23 09:48:45 compute-1 systemd[1]: Started Session 27 of User ceph-admin.
Jan 23 09:48:45 compute-1 sshd-session[72741]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:45 compute-1 sudo[72745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Jan 23 09:48:45 compute-1 sudo[72745]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:45 compute-1 sudo[72745]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:46 compute-1 sshd-session[72770]: Accepted publickey for ceph-admin from 192.168.122.100 port 59054 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:46 compute-1 systemd-logind[807]: New session 28 of user ceph-admin.
Jan 23 09:48:46 compute-1 systemd[1]: Started Session 28 of User ceph-admin.
Jan 23 09:48:46 compute-1 sshd-session[72770]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:46 compute-1 sudo[72774]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:48:46 compute-1 sudo[72774]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:46 compute-1 sudo[72774]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:46 compute-1 sshd-session[72799]: Accepted publickey for ceph-admin from 192.168.122.100 port 59064 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:46 compute-1 systemd-logind[807]: New session 29 of user ceph-admin.
Jan 23 09:48:46 compute-1 systemd[1]: Started Session 29 of User ceph-admin.
Jan 23 09:48:46 compute-1 sshd-session[72799]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:46 compute-1 sudo[72803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new
Jan 23 09:48:46 compute-1 sudo[72803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:46 compute-1 sudo[72803]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:46 compute-1 sshd-session[72828]: Accepted publickey for ceph-admin from 192.168.122.100 port 59080 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:46 compute-1 systemd-logind[807]: New session 30 of user ceph-admin.
Jan 23 09:48:46 compute-1 systemd[1]: Started Session 30 of User ceph-admin.
Jan 23 09:48:46 compute-1 sshd-session[72828]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:48 compute-1 sshd-session[72855]: Accepted publickey for ceph-admin from 192.168.122.100 port 59090 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:48 compute-1 systemd-logind[807]: New session 31 of user ceph-admin.
Jan 23 09:48:48 compute-1 systemd[1]: Started Session 31 of User ceph-admin.
Jan 23 09:48:48 compute-1 sshd-session[72855]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:48 compute-1 sudo[72859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36
Jan 23 09:48:48 compute-1 sudo[72859]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:48 compute-1 sudo[72859]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:48 compute-1 sshd-session[72884]: Accepted publickey for ceph-admin from 192.168.122.100 port 59098 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:48:48 compute-1 systemd-logind[807]: New session 32 of user ceph-admin.
Jan 23 09:48:48 compute-1 systemd[1]: Started Session 32 of User ceph-admin.
Jan 23 09:48:48 compute-1 sshd-session[72884]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:48:48 compute-1 sudo[72888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host --expect-hostname compute-1
Jan 23 09:48:48 compute-1 sudo[72888]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:48 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:48:48 compute-1 sudo[72888]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:48 compute-1 sudo[72934]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:48:48 compute-1 sudo[72934]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:49 compute-1 sudo[72934]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:49 compute-1 sudo[72959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 09:48:49 compute-1 sudo[72959]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:48:49 compute-1 sudo[72959]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:49 compute-1 sudo[73003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:48:49 compute-1 sudo[73003]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:49 compute-1 sudo[73003]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:49 compute-1 sudo[73028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:48:49 compute-1 sudo[73028]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:49 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:48:49 compute-1 sudo[73028]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:50 compute-1 sudo[73088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:48:50 compute-1 sudo[73088]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:50 compute-1 sudo[73088]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:50 compute-1 sudo[73113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:48:50 compute-1 sudo[73113]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:50 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:48:50 compute-1 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 73148 (sysctl)
Jan 23 09:48:50 compute-1 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Jan 23 09:48:50 compute-1 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Jan 23 09:48:50 compute-1 sudo[73113]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:50 compute-1 sudo[73171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:48:50 compute-1 sudo[73171]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:50 compute-1 sudo[73171]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:50 compute-1 sudo[73196]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:48:50 compute-1 sudo[73196]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:51 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:48:51 compute-1 sudo[73196]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:51 compute-1 sudo[73239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:48:51 compute-1 sudo[73239]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:51 compute-1 sudo[73239]: pam_unix(sudo:session): session closed for user root
Jan 23 09:48:51 compute-1 sudo[73264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- inventory --format=json-pretty --filter-for-batch
Jan 23 09:48:51 compute-1 sudo[73264]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:48:51 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:48:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-compat3883837964-lower\x2dmapped.mount: Deactivated successfully.
Jan 23 09:49:17 compute-1 podman[73325]: 2026-01-23 09:49:17.915282154 +0000 UTC m=+26.365516815 container create 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS)
Jan 23 09:49:17 compute-1 systemd[1]: var-lib-containers-storage-overlay-volatile\x2dcheck240398213-merged.mount: Deactivated successfully.
Jan 23 09:49:17 compute-1 podman[73325]: 2026-01-23 09:49:17.897979938 +0000 UTC m=+26.348214619 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:17 compute-1 systemd[1]: Created slice Virtual Machine and Container Slice.
Jan 23 09:49:17 compute-1 systemd[1]: Started libpod-conmon-49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f.scope.
Jan 23 09:49:17 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:18 compute-1 podman[73325]: 2026-01-23 09:49:18.01155549 +0000 UTC m=+26.461790251 container init 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True)
Jan 23 09:49:18 compute-1 podman[73325]: 2026-01-23 09:49:18.023111585 +0000 UTC m=+26.473346246 container start 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:18 compute-1 podman[73325]: 2026-01-23 09:49:18.027193423 +0000 UTC m=+26.477428144 container attach 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:18 compute-1 tender_kapitsa[73407]: 167 167
Jan 23 09:49:18 compute-1 systemd[1]: libpod-49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f.scope: Deactivated successfully.
Jan 23 09:49:18 compute-1 podman[73325]: 2026-01-23 09:49:18.033292965 +0000 UTC m=+26.483527666 container died 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:49:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-05797ca99d44ba6745e4b9cc15b5014df2a4a0bccd229ef6043ead52cc06c67d-merged.mount: Deactivated successfully.
Jan 23 09:49:18 compute-1 podman[73325]: 2026-01-23 09:49:18.084894343 +0000 UTC m=+26.535129034 container remove 49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=tender_kapitsa, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:49:18 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:18 compute-1 systemd[1]: libpod-conmon-49e8a378bf3dbae402385d0fcec23d8ada2b3410b5cb12051fdad0156a22bc7f.scope: Deactivated successfully.
Jan 23 09:49:18 compute-1 podman[73430]: 2026-01-23 09:49:18.300359088 +0000 UTC m=+0.040482518 container create f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:49:18 compute-1 systemd[1]: Started libpod-conmon-f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4.scope.
Jan 23 09:49:18 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a9efd00be1246942a1fb7f43342804d54e64fa2dfe4cc31eee8922f095b8156/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:18 compute-1 podman[73430]: 2026-01-23 09:49:18.282101172 +0000 UTC m=+0.022224542 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:18 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2a9efd00be1246942a1fb7f43342804d54e64fa2dfe4cc31eee8922f095b8156/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:18 compute-1 podman[73430]: 2026-01-23 09:49:18.387758344 +0000 UTC m=+0.127881704 container init f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:49:18 compute-1 podman[73430]: 2026-01-23 09:49:18.395532249 +0000 UTC m=+0.135655589 container start f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 09:49:18 compute-1 podman[73430]: 2026-01-23 09:49:18.39907048 +0000 UTC m=+0.139193850 container attach f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:49:19 compute-1 fervent_yalow[73446]: [
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:     {
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "available": false,
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "being_replaced": false,
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "ceph_device_lvm": false,
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "lsm_data": {},
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "lvs": [],
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "path": "/dev/sr0",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "rejected_reasons": [
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "Insufficient space (<5GB)",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "Has a FileSystem"
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         ],
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         "sys_api": {
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "actuators": null,
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "device_nodes": [
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:                 "sr0"
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             ],
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "devname": "sr0",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "human_readable_size": "482.00 KB",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "id_bus": "ata",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "model": "QEMU DVD-ROM",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "nr_requests": "2",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "parent": "/dev/sr0",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "partitions": {},
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "path": "/dev/sr0",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "removable": "1",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "rev": "2.5+",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "ro": "0",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "rotational": "1",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "sas_address": "",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "sas_device_handle": "",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "scheduler_mode": "mq-deadline",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "sectors": 0,
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "sectorsize": "2048",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "size": 493568.0,
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "support_discard": "2048",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "type": "disk",
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:             "vendor": "QEMU"
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:         }
Jan 23 09:49:19 compute-1 fervent_yalow[73446]:     }
Jan 23 09:49:19 compute-1 fervent_yalow[73446]: ]
Jan 23 09:49:19 compute-1 systemd[1]: libpod-f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4.scope: Deactivated successfully.
Jan 23 09:49:19 compute-1 podman[73430]: 2026-01-23 09:49:19.176339713 +0000 UTC m=+0.916463083 container died f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:49:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-2a9efd00be1246942a1fb7f43342804d54e64fa2dfe4cc31eee8922f095b8156-merged.mount: Deactivated successfully.
Jan 23 09:49:19 compute-1 podman[73430]: 2026-01-23 09:49:19.232545025 +0000 UTC m=+0.972668365 container remove f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=fervent_yalow, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Jan 23 09:49:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:19 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:19 compute-1 systemd[1]: libpod-conmon-f41196fbcc7a142d0f828f05411db92c27ce3a45eb84f2a2819f822db2965af4.scope: Deactivated successfully.
Jan 23 09:49:19 compute-1 sudo[73264]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:49:19 compute-1 sudo[74450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74450]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:49:19 compute-1 sudo[74475]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74475]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:49:19 compute-1 sudo[74500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74500]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:49:19 compute-1 sudo[74525]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74525]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74550]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:49:19 compute-1 sudo[74550]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74550]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:49:19 compute-1 sudo[74598]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74598]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:49:19 compute-1 sudo[74623]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74623]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:19 compute-1 sudo[74648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:49:19 compute-1 sudo[74648]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:19 compute-1 sudo[74648]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:49:20 compute-1 sudo[74673]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74673]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74698]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:49:20 compute-1 sudo[74698]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74698]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:49:20 compute-1 sudo[74723]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74723]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:49:20 compute-1 sudo[74748]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74748]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:49:20 compute-1 sudo[74773]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74773]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:49:20 compute-1 sudo[74821]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74821]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:49:20 compute-1 sudo[74846]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74846]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:49:20 compute-1 sudo[74871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74871]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:49:20 compute-1 sudo[74896]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74896]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:49:20 compute-1 sudo[74921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74921]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74946]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:49:20 compute-1 sudo[74946]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74946]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:49:20 compute-1 sudo[74971]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74971]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:20 compute-1 sudo[74996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:49:20 compute-1 sudo[74996]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:20 compute-1 sudo[74996]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:49:21 compute-1 sudo[75044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75044]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:49:21 compute-1 sudo[75069]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75069]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:49:21 compute-1 sudo[75094]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75094]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:49:21 compute-1 sudo[75119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75119]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:49:21 compute-1 sudo[75144]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75144]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:49:21 compute-1 sudo[75169]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75169]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:49:21 compute-1 sudo[75194]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75194]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:49:21 compute-1 sudo[75219]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75219]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:49:21 compute-1 sudo[75267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75267]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:49:21 compute-1 sudo[75292]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75292]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:49:21 compute-1 sudo[75317]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75317]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:21 compute-1 sudo[75342]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:21 compute-1 sudo[75342]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:21 compute-1 sudo[75367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:49:21 compute-1 sudo[75367]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:22 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:22 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:22 compute-1 podman[75433]: 2026-01-23 09:49:22.260559577 +0000 UTC m=+0.043698469 container create bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.vendor=CentOS, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:49:22 compute-1 systemd[1]: Started libpod-conmon-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope.
Jan 23 09:49:22 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:22 compute-1 podman[75433]: 2026-01-23 09:49:22.243369494 +0000 UTC m=+0.026508406 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:22 compute-1 podman[75433]: 2026-01-23 09:49:22.340927481 +0000 UTC m=+0.124066403 container init bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 23 09:49:22 compute-1 podman[75433]: 2026-01-23 09:49:22.348935054 +0000 UTC m=+0.132073946 container start bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 09:49:22 compute-1 podman[75433]: 2026-01-23 09:49:22.353175567 +0000 UTC m=+0.136314459 container attach bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 23 09:49:22 compute-1 zealous_sutherland[75450]: 167 167
Jan 23 09:49:22 compute-1 systemd[1]: libpod-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope: Deactivated successfully.
Jan 23 09:49:22 compute-1 conmon[75450]: conmon bf358994fa0eaa3ceb96 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope/container/memory.events
Jan 23 09:49:22 compute-1 podman[75433]: 2026-01-23 09:49:22.357002959 +0000 UTC m=+0.140141891 container died bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 09:49:22 compute-1 podman[75433]: 2026-01-23 09:49:22.449599799 +0000 UTC m=+0.232738731 container remove bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zealous_sutherland, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 23 09:49:22 compute-1 systemd[1]: libpod-conmon-bf358994fa0eaa3ceb96434ce52bfaafbebbec597028e24352c76bd2b66306bd.scope: Deactivated successfully.
Jan 23 09:49:22 compute-1 systemd[1]: Reloading.
Jan 23 09:49:22 compute-1 systemd-rc-local-generator[75490]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:49:22 compute-1 systemd-sysv-generator[75494]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:49:22 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:22 compute-1 systemd[1]: Reloading.
Jan 23 09:49:22 compute-1 systemd-rc-local-generator[75531]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:49:22 compute-1 systemd-sysv-generator[75535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:49:23 compute-1 systemd[1]: Reached target All Ceph clusters and services.
Jan 23 09:49:23 compute-1 systemd[1]: Reloading.
Jan 23 09:49:23 compute-1 systemd-rc-local-generator[75565]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:49:23 compute-1 systemd-sysv-generator[75568]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:49:23 compute-1 systemd[1]: Reached target Ceph cluster f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:49:23 compute-1 systemd[1]: Reloading.
Jan 23 09:49:23 compute-1 systemd-rc-local-generator[75601]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:49:23 compute-1 systemd-sysv-generator[75605]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:49:23 compute-1 systemd[1]: Reloading.
Jan 23 09:49:23 compute-1 systemd-rc-local-generator[75648]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:49:23 compute-1 systemd-sysv-generator[75651]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:49:23 compute-1 systemd[1]: Created slice Slice /system/ceph-f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:49:23 compute-1 systemd[1]: Reached target System Time Set.
Jan 23 09:49:23 compute-1 systemd[1]: Reached target System Time Synchronized.
Jan 23 09:49:23 compute-1 systemd[1]: Starting Ceph crash.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:49:23 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:24 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:24 compute-1 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Jan 23 09:49:24 compute-1 podman[75702]: 2026-01-23 09:49:24.200286878 +0000 UTC m=+0.058757084 container create 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 23 09:49:24 compute-1 podman[75702]: 2026-01-23 09:49:24.16767125 +0000 UTC m=+0.026141506 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e5a23edbb65ae3813549cdaf0db86959f39a099db0f015b3bec080e11441f80/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e5a23edbb65ae3813549cdaf0db86959f39a099db0f015b3bec080e11441f80/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7e5a23edbb65ae3813549cdaf0db86959f39a099db0f015b3bec080e11441f80/merged/etc/ceph/ceph.client.crash.compute-1.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:24 compute-1 podman[75702]: 2026-01-23 09:49:24.295047906 +0000 UTC m=+0.153518162 container init 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:49:24 compute-1 podman[75702]: 2026-01-23 09:49:24.300648193 +0000 UTC m=+0.159118409 container start 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:49:24 compute-1 bash[75702]: 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f
Jan 23 09:49:24 compute-1 systemd[1]: Started Ceph crash.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: INFO:ceph-crash:pinging cluster to exercise our key
Jan 23 09:49:24 compute-1 sudo[75367]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.445+0000 7f52cace8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.445+0000 7f52cace8640 -1 AuthRegistry(0x7f52c40698f0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.446+0000 7f52cace8640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.446+0000 7f52cace8640 -1 AuthRegistry(0x7f52cace6ff0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.502+0000 7f52c8a5d640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: 2026-01-23T09:49:24.502+0000 7f52cace8640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: [errno 13] RADOS permission denied (error connecting to the cluster)
Jan 23 09:49:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1[75717]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Jan 23 09:49:25 compute-1 sudo[75735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:25 compute-1 sudo[75735]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:25 compute-1 sudo[75735]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:25 compute-1 sudo[75760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 --yes --no-systemd
Jan 23 09:49:25 compute-1 sudo[75760]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:25 compute-1 podman[75824]: 2026-01-23 09:49:25.848166406 +0000 UTC m=+0.076593696 container create f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:49:25 compute-1 podman[75824]: 2026-01-23 09:49:25.794782493 +0000 UTC m=+0.023209803 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:25 compute-1 systemd[1]: Started libpod-conmon-f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b.scope.
Jan 23 09:49:25 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:25 compute-1 podman[75824]: 2026-01-23 09:49:25.949931615 +0000 UTC m=+0.178358935 container init f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325)
Jan 23 09:49:25 compute-1 podman[75824]: 2026-01-23 09:49:25.960110137 +0000 UTC m=+0.188537447 container start f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 23 09:49:25 compute-1 musing_roentgen[75841]: 167 167
Jan 23 09:49:25 compute-1 systemd[1]: libpod-f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b.scope: Deactivated successfully.
Jan 23 09:49:25 compute-1 podman[75824]: 2026-01-23 09:49:25.979966923 +0000 UTC m=+0.208394233 container attach f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default)
Jan 23 09:49:25 compute-1 podman[75824]: 2026-01-23 09:49:25.980914192 +0000 UTC m=+0.209341512 container died f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:49:26 compute-1 systemd[1]: var-lib-containers-storage-overlay-a55b7ab61ea8d62c1744f01b99bdf1124094057c0b58eb7b39dfa65ae663ea2f-merged.mount: Deactivated successfully.
Jan 23 09:49:26 compute-1 podman[75824]: 2026-01-23 09:49:26.05121978 +0000 UTC m=+0.279647060 container remove f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=musing_roentgen, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:26 compute-1 systemd[1]: libpod-conmon-f187e44d104362bcea5a264ee70676e45c3af7347cd5663d430546c8db2ac19b.scope: Deactivated successfully.
Jan 23 09:49:26 compute-1 podman[75867]: 2026-01-23 09:49:26.273132138 +0000 UTC m=+0.104731124 container create d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 09:49:26 compute-1 podman[75867]: 2026-01-23 09:49:26.193973391 +0000 UTC m=+0.025572397 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:26 compute-1 systemd[1]: Started libpod-conmon-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope.
Jan 23 09:49:26 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:26 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:26 compute-1 podman[75867]: 2026-01-23 09:49:26.409845279 +0000 UTC m=+0.241444295 container init d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:49:26 compute-1 podman[75867]: 2026-01-23 09:49:26.420848966 +0000 UTC m=+0.252447952 container start d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:49:26 compute-1 podman[75867]: 2026-01-23 09:49:26.436594763 +0000 UTC m=+0.268193779 container attach d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, OSD_FLAVOR=default)
Jan 23 09:49:26 compute-1 trusting_wu[75883]: --> passed data devices: 0 physical, 1 LVM
Jan 23 09:49:26 compute-1 trusting_wu[75883]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:26 compute-1 trusting_wu[75883]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:26 compute-1 trusting_wu[75883]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 92663454-00ec-4b9a-bcda-939cb5c501aa
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-0
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-0/activate.monmap
Jan 23 09:49:27 compute-1 lvm[75947]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:49:27 compute-1 lvm[75947]: VG ceph_vg0 finished
Jan 23 09:49:27 compute-1 trusting_wu[75883]:  stderr: got monmap epoch 1
Jan 23 09:49:27 compute-1 trusting_wu[75883]: --> Creating keyring file for osd.0
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/keyring
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0/
Jan 23 09:49:27 compute-1 trusting_wu[75883]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 0 --monmap /var/lib/ceph/osd/ceph-0/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-0/ --osd-uuid 92663454-00ec-4b9a-bcda-939cb5c501aa --setuser ceph --setgroup ceph
Jan 23 09:49:36 compute-1 trusting_wu[75883]:  stderr: 2026-01-23T09:49:28.055+0000 7fe4104af740 -1 bluestore(/var/lib/ceph/osd/ceph-0//block) No valid bdev label found
Jan 23 09:49:36 compute-1 trusting_wu[75883]:  stderr: 2026-01-23T09:49:28.322+0000 7fe4104af740 -1 bluestore(/var/lib/ceph/osd/ceph-0/) _read_fsid unparsable uuid
Jan 23 09:49:36 compute-1 trusting_wu[75883]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Jan 23 09:49:36 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 09:49:36 compute-1 trusting_wu[75883]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Jan 23 09:49:36 compute-1 trusting_wu[75883]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:36 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:36 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:49:36 compute-1 trusting_wu[75883]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 09:49:36 compute-1 trusting_wu[75883]: --> ceph-volume lvm activate successful for osd ID: 0
Jan 23 09:49:36 compute-1 trusting_wu[75883]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Jan 23 09:49:36 compute-1 systemd[1]: libpod-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope: Deactivated successfully.
Jan 23 09:49:36 compute-1 systemd[1]: libpod-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope: Consumed 2.217s CPU time.
Jan 23 09:49:36 compute-1 podman[75867]: 2026-01-23 09:49:36.641107024 +0000 UTC m=+10.472706040 container died d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:49:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-cab610d7b543bbde483fd06bdb9fbfd0d0a0dbc05935bc8835de568a72722c18-merged.mount: Deactivated successfully.
Jan 23 09:49:37 compute-1 podman[75867]: 2026-01-23 09:49:37.205300897 +0000 UTC m=+11.036899913 container remove d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=trusting_wu, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:49:37 compute-1 systemd[1]: libpod-conmon-d3d93f6e040426ef885a18aa269497962afbed3ba50ef59910f5782928f5d12f.scope: Deactivated successfully.
Jan 23 09:49:37 compute-1 sudo[75760]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:37 compute-1 sudo[76861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:37 compute-1 sudo[76861]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:37 compute-1 sudo[76861]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:37 compute-1 sudo[76886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- lvm list --format json
Jan 23 09:49:37 compute-1 sudo[76886]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:37 compute-1 podman[76951]: 2026-01-23 09:49:37.820398455 +0000 UTC m=+0.027695315 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:37 compute-1 podman[76951]: 2026-01-23 09:49:37.952017996 +0000 UTC m=+0.159314866 container create 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:49:38 compute-1 systemd[1]: Started libpod-conmon-6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517.scope.
Jan 23 09:49:38 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:38 compute-1 podman[76951]: 2026-01-23 09:49:38.093916071 +0000 UTC m=+0.301212991 container init 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default)
Jan 23 09:49:38 compute-1 podman[76951]: 2026-01-23 09:49:38.102677596 +0000 UTC m=+0.309974446 container start 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325)
Jan 23 09:49:38 compute-1 unruffled_faraday[76967]: 167 167
Jan 23 09:49:38 compute-1 systemd[1]: libpod-6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517.scope: Deactivated successfully.
Jan 23 09:49:38 compute-1 podman[76951]: 2026-01-23 09:49:38.116701409 +0000 UTC m=+0.323998269 container attach 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 09:49:38 compute-1 podman[76951]: 2026-01-23 09:49:38.117146153 +0000 UTC m=+0.324443023 container died 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 09:49:38 compute-1 systemd[1]: var-lib-containers-storage-overlay-6d592c6c6aee16d2ca07c8583a263c323d6f0eff28b30da164c6d4cfb1a74e09-merged.mount: Deactivated successfully.
Jan 23 09:49:38 compute-1 podman[76951]: 2026-01-23 09:49:38.312566505 +0000 UTC m=+0.519863345 container remove 6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=unruffled_faraday, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:49:38 compute-1 systemd[1]: libpod-conmon-6b80f3121ad68c3e23986041ce55f385093f45b3b2c1bdb120aca9114601b517.scope: Deactivated successfully.
Jan 23 09:49:38 compute-1 podman[76994]: 2026-01-23 09:49:38.502870607 +0000 UTC m=+0.036292635 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:38 compute-1 podman[76994]: 2026-01-23 09:49:38.600412114 +0000 UTC m=+0.133834132 container create cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:49:38 compute-1 systemd[1]: Started libpod-conmon-cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6.scope.
Jan 23 09:49:38 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:38 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:38 compute-1 podman[76994]: 2026-01-23 09:49:38.738897121 +0000 UTC m=+0.272319149 container init cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 09:49:38 compute-1 podman[76994]: 2026-01-23 09:49:38.749881087 +0000 UTC m=+0.283303105 container start cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 09:49:38 compute-1 podman[76994]: 2026-01-23 09:49:38.873586249 +0000 UTC m=+0.407008267 container attach cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 09:49:39 compute-1 blissful_elion[77011]: {
Jan 23 09:49:39 compute-1 blissful_elion[77011]:     "0": [
Jan 23 09:49:39 compute-1 blissful_elion[77011]:         {
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "devices": [
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "/dev/loop3"
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             ],
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "lv_name": "ceph_lv0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "lv_size": "21470642176",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=PKvnhX-HhRc-31GG-pCfe-6hIl-d5aM-9QV6d5,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=f3005f84-239a-55b6-a948-8f1fb592b920,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=92663454-00ec-4b9a-bcda-939cb5c501aa,ceph.osd_id=0,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0,ceph.with_tpm=0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "lv_uuid": "PKvnhX-HhRc-31GG-pCfe-6hIl-d5aM-9QV6d5",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "name": "ceph_lv0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "path": "/dev/ceph_vg0/ceph_lv0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "tags": {
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.block_uuid": "PKvnhX-HhRc-31GG-pCfe-6hIl-d5aM-9QV6d5",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.cephx_lockbox_secret": "",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.cluster_fsid": "f3005f84-239a-55b6-a948-8f1fb592b920",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.cluster_name": "ceph",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.crush_device_class": "",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.encrypted": "0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.osd_fsid": "92663454-00ec-4b9a-bcda-939cb5c501aa",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.osd_id": "0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.osdspec_affinity": "default_drive_group",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.type": "block",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.vdo": "0",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:                 "ceph.with_tpm": "0"
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             },
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "type": "block",
Jan 23 09:49:39 compute-1 blissful_elion[77011]:             "vg_name": "ceph_vg0"
Jan 23 09:49:39 compute-1 blissful_elion[77011]:         }
Jan 23 09:49:39 compute-1 blissful_elion[77011]:     ]
Jan 23 09:49:39 compute-1 blissful_elion[77011]: }
Jan 23 09:49:39 compute-1 systemd[1]: libpod-cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6.scope: Deactivated successfully.
Jan 23 09:49:39 compute-1 podman[77020]: 2026-01-23 09:49:39.090871301 +0000 UTC m=+0.025945809 container died cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 23 09:49:39 compute-1 systemd[1]: var-lib-containers-storage-overlay-d3a3ce797ca06334bc44281d0d04ba18f8c2def97ee5d0cd72fb3e42f4a6cc93-merged.mount: Deactivated successfully.
Jan 23 09:49:39 compute-1 podman[77020]: 2026-01-23 09:49:39.373684889 +0000 UTC m=+0.308759387 container remove cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=blissful_elion, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:39 compute-1 systemd[1]: libpod-conmon-cf73187f3871d670e7ae5e6f3ce51eb94d475653f4227dc3179b16e27d6612e6.scope: Deactivated successfully.
Jan 23 09:49:39 compute-1 sudo[76886]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:39 compute-1 sudo[77036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:39 compute-1 sudo[77036]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:39 compute-1 sudo[77036]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:39 compute-1 sudo[77061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:49:39 compute-1 sudo[77061]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:40 compute-1 podman[77126]: 2026-01-23 09:49:39.988484419 +0000 UTC m=+0.025025531 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:40 compute-1 podman[77126]: 2026-01-23 09:49:40.165292914 +0000 UTC m=+0.201834026 container create f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:49:40 compute-1 systemd[1]: Started libpod-conmon-f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3.scope.
Jan 23 09:49:40 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:40 compute-1 podman[77126]: 2026-01-23 09:49:40.339452706 +0000 UTC m=+0.375993858 container init f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:49:40 compute-1 podman[77126]: 2026-01-23 09:49:40.345859688 +0000 UTC m=+0.382400790 container start f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default)
Jan 23 09:49:40 compute-1 elated_golick[77143]: 167 167
Jan 23 09:49:40 compute-1 systemd[1]: libpod-f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3.scope: Deactivated successfully.
Jan 23 09:49:40 compute-1 podman[77126]: 2026-01-23 09:49:40.362154172 +0000 UTC m=+0.398695354 container attach f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 09:49:40 compute-1 podman[77126]: 2026-01-23 09:49:40.36272551 +0000 UTC m=+0.399266652 container died f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325)
Jan 23 09:49:40 compute-1 systemd[1]: var-lib-containers-storage-overlay-ac93b8b960b88f01538dfc377c79ac43fbe7d2360e229169fbea32c69e52f84f-merged.mount: Deactivated successfully.
Jan 23 09:49:40 compute-1 podman[77126]: 2026-01-23 09:49:40.454263527 +0000 UTC m=+0.490804669 container remove f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elated_golick, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 09:49:40 compute-1 systemd[1]: libpod-conmon-f0124d571157d71419623c6138bc1f51a6bf43b3dac06346bddf22b7aa035ed3.scope: Deactivated successfully.
Jan 23 09:49:40 compute-1 podman[77174]: 2026-01-23 09:49:40.758664807 +0000 UTC m=+0.047016175 container create 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:49:40 compute-1 podman[77174]: 2026-01-23 09:49:40.734104651 +0000 UTC m=+0.022455999 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:40 compute-1 systemd[1]: Started libpod-conmon-9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8.scope.
Jan 23 09:49:40 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:40 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:40 compute-1 podman[77174]: 2026-01-23 09:49:40.900921123 +0000 UTC m=+0.189272481 container init 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 09:49:40 compute-1 podman[77174]: 2026-01-23 09:49:40.913876071 +0000 UTC m=+0.202227399 container start 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:49:41 compute-1 podman[77174]: 2026-01-23 09:49:41.017720546 +0000 UTC m=+0.306071904 container attach 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:49:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test[77190]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_FSID]
Jan 23 09:49:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test[77190]:                             [--no-systemd] [--no-tmpfs]
Jan 23 09:49:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test[77190]: ceph-volume activate: error: unrecognized arguments: --bad-option
Jan 23 09:49:41 compute-1 systemd[1]: libpod-9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8.scope: Deactivated successfully.
Jan 23 09:49:41 compute-1 podman[77174]: 2026-01-23 09:49:41.117646767 +0000 UTC m=+0.405998125 container died 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:49:41 compute-1 systemd[1]: var-lib-containers-storage-overlay-ff61a312d9a36c6ae0de2ffd8d95f440e7b4ea17fdb3c8ecccaaefe1bf14c579-merged.mount: Deactivated successfully.
Jan 23 09:49:41 compute-1 podman[77174]: 2026-01-23 09:49:41.513351696 +0000 UTC m=+0.801703034 container remove 9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate-test, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:49:41 compute-1 systemd[1]: libpod-conmon-9e49d4131b86f62aa64c571d771ae47fab31489a4333631155e9dc38362c12e8.scope: Deactivated successfully.
Jan 23 09:49:42 compute-1 systemd[1]: Reloading.
Jan 23 09:49:42 compute-1 systemd-rc-local-generator[77247]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:49:42 compute-1 systemd-sysv-generator[77252]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:49:42 compute-1 systemd[1]: Reloading.
Jan 23 09:49:42 compute-1 systemd-rc-local-generator[77289]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:49:42 compute-1 systemd-sysv-generator[77295]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:49:42 compute-1 systemd[1]: Starting Ceph osd.0 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:49:42 compute-1 podman[77350]: 2026-01-23 09:49:42.751904036 +0000 UTC m=+0.050207855 container create 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 09:49:42 compute-1 podman[77350]: 2026-01-23 09:49:42.728123915 +0000 UTC m=+0.026427894 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:42 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:42 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:42 compute-1 podman[77350]: 2026-01-23 09:49:42.976001672 +0000 UTC m=+0.274305581 container init 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:49:42 compute-1 podman[77350]: 2026-01-23 09:49:42.987240867 +0000 UTC m=+0.285544696 container start 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1)
Jan 23 09:49:43 compute-1 podman[77350]: 2026-01-23 09:49:43.052226196 +0000 UTC m=+0.350530015 container attach 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, io.buildah.version=1.40.1)
Jan 23 09:49:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:43 compute-1 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:43 compute-1 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:43 compute-1 lvm[77446]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:49:43 compute-1 lvm[77446]: VG ceph_vg0 finished
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:44 compute-1 bash[77350]: --> Failed to activate via raw: did not find any matching OSD to activate
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/ceph-authtool --gen-print-key
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-0 --no-mon-config
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 09:49:44 compute-1 bash[77350]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0
Jan 23 09:49:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate[77365]: --> ceph-volume lvm activate successful for osd ID: 0
Jan 23 09:49:44 compute-1 bash[77350]: --> ceph-volume lvm activate successful for osd ID: 0
Jan 23 09:49:44 compute-1 systemd[1]: libpod-64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f.scope: Deactivated successfully.
Jan 23 09:49:44 compute-1 systemd[1]: libpod-64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f.scope: Consumed 1.750s CPU time.
Jan 23 09:49:44 compute-1 podman[77540]: 2026-01-23 09:49:44.591243962 +0000 UTC m=+0.032728834 container died 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:49:44 compute-1 systemd[1]: var-lib-containers-storage-overlay-17a2e60604994e78a3a6c47d66a9f99280a8d12d3b187d3c254429567c6aa8a5-merged.mount: Deactivated successfully.
Jan 23 09:49:44 compute-1 podman[77540]: 2026-01-23 09:49:44.792376645 +0000 UTC m=+0.233861437 container remove 64545c34976279fa23938487799ec0127080ce9ce875ab7d4edd1b22a09e802f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0-activate, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 23 09:49:45 compute-1 podman[77599]: 2026-01-23 09:49:45.123119985 +0000 UTC m=+0.118171028 container create 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:45 compute-1 podman[77599]: 2026-01-23 09:49:45.038714543 +0000 UTC m=+0.033765586 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:45 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e0ce4fbabd5ffefa68bf9aff2293a48738565926dfa149780d0ae8800db74858/merged/var/lib/ceph/osd/ceph-0 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:45 compute-1 podman[77599]: 2026-01-23 09:49:45.205747841 +0000 UTC m=+0.200798944 container init 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:49:45 compute-1 podman[77599]: 2026-01-23 09:49:45.210691976 +0000 UTC m=+0.205743019 container start 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1)
Jan 23 09:49:45 compute-1 bash[77599]: 70bc56e6e481c715160044ab59ecb88ebb44c3388a9f7a7a34bc220e894d037b
Jan 23 09:49:45 compute-1 systemd[1]: Started Ceph osd.0 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:49:45 compute-1 ceph-osd[77616]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:49:45 compute-1 ceph-osd[77616]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-osd, pid 2
Jan 23 09:49:45 compute-1 ceph-osd[77616]: pidfile_write: ignore empty --pid-file
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:45 compute-1 sudo[77061]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Jan 23 09:49:45 compute-1 ceph-osd[77616]: bdev(0x55a55e89bc00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55e89b800 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:46 compute-1 ceph-osd[77616]: starting osd.0 osd_data /var/lib/ceph/osd/ceph-0 /var/lib/ceph/osd/ceph-0/journal
Jan 23 09:49:46 compute-1 ceph-osd[77616]: load: jerasure load: lrc 
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:46 compute-1 ceph-osd[77616]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Jan 23 09:49:46 compute-1 ceph-osd[77616]: osd.0:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:46 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f740c00 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 kv_onode 0.04 data 0.06
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount shared_bdev_used = 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: RocksDB version: 7.9.2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Git sha 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: DB SUMMARY
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: DB Session ID:  KYUCMB80H616SE245L90
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: CURRENT file:  CURRENT
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.error_if_exists: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.create_if_missing: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                     Options.env: 0x55a55f711dc0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                Options.info_log: 0x55a55f7157a0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                              Options.statistics: (nil)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.use_fsync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                              Options.db_log_dir: 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.write_buffer_manager: 0x55a55f80aa00
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.unordered_write: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.row_cache: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                              Options.wal_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.two_write_queues: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.wal_compression: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.atomic_flush: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_background_jobs: 4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_background_compactions: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_subcompactions: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.max_open_files: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Compression algorithms supported:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kZSTD supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kXpressCompression supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kBZip2Compression supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kLZ4Compression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kZlibCompression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kLZ4HCCompression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kSnappyCompression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b60)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e9309b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e9309b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715b80)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e9309b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: edb5d1d6-d8de-4399-98b8-c0de0b841c0c
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161787309052, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161787309291, "job": 1, "event": "recovery_finished"}
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old nid_max 1025
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta old blobid_max 10240
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_super_meta min_alloc_size 0x1000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: freelist init
Jan 23 09:49:47 compute-1 ceph-osd[77616]: freelist _read_cfg
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _init_alloc loaded 20 GiB in 2 extents, allocator type hybrid, capacity 0x4ffc00000, block size 0x1000, free 0x4ffbfd000, fragmentation 1.9e-07
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs umount
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) close
Jan 23 09:49:47 compute-1 sudo[77852]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:47 compute-1 sudo[77852]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:47 compute-1 sudo[77852]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:47 compute-1 sudo[77877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- raw list --format json
Jan 23 09:49:47 compute-1 sudo[77877]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open path /var/lib/ceph/osd/ceph-0/block
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-0/block failed: (22) Invalid argument
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open backing device/file reports st_blksize 512, using bdev_block_size 4096 anyway
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bdev(0x55a55f741000 /var/lib/ceph/osd/ceph-0/block) open size 21470642176 (0x4ffc00000, 20 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-0/block size 20 GiB
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs _init_alloc shared, id 1, capacity 0x4ffc00000, block size 0x10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 0 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 1 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 2 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 3 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount final locked allocations 4 <0x0~0, [0x0~0], 0x0~0> => <0x0~0, [0x0~0], 0x0~0>
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluefs mount shared_bdev_used = 4718592
Jan 23 09:49:47 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _prepare_db_environment set db_paths to db,20397110067 db.slow,20397110067
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: RocksDB version: 7.9.2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Git sha 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: DB SUMMARY
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: DB Session ID:  KYUCMB80H616SE245L91
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: CURRENT file:  CURRENT
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5097 ; 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.error_if_exists: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.create_if_missing: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                     Options.env: 0x55a55f8ae9a0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                      Options.fs: LegacyFileSystem
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                Options.info_log: 0x55a55f715940
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                              Options.statistics: (nil)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.use_fsync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                              Options.db_log_dir: 
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                                 Options.wal_dir: db.wal
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.write_buffer_manager: 0x55a55f80aa00
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.unordered_write: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.row_cache: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                              Options.wal_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.two_write_queues: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.wal_compression: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.atomic_flush: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_background_jobs: 4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_background_compactions: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_subcompactions: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.writable_file_max_buffer_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.max_total_wal_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.max_open_files: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.compaction_readahead_size: 2097152
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Compression algorithms supported:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kZSTD supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kXpressCompression supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kBZip2Compression supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kLZ4Compression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kZlibCompression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kLZ4HCCompression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         kSnappyCompression supported: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715680)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e931350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 483183820
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e9309b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e9309b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:           Options.merge_operator: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55a55f715ac0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x55a55e9309b0
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.write_buffer_size: 16777216
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.max_write_buffer_number: 64
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.compression: LZ4
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.num_levels: 7
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: edb5d1d6-d8de-4399-98b8-c0de0b841c0c
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161787583393, "job": 1, "event": "recovery_started", "wal_files": [31]}
Jan 23 09:49:47 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Jan 23 09:49:48 compute-1 podman[78123]: 2026-01-23 09:49:47.917729703 +0000 UTC m=+0.024168300 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791589979, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1272, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161787, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "edb5d1d6-d8de-4399-98b8-c0de0b841c0c", "db_session_id": "KYUCMB80H616SE245L91", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:49:51 compute-1 podman[78123]: 2026-01-23 09:49:51.626736577 +0000 UTC m=+3.733175154 container create 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, ceph=True)
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791636078, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1595, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 469, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 571, "raw_average_value_size": 285, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161791, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "edb5d1d6-d8de-4399-98b8-c0de0b841c0c", "db_session_id": "KYUCMB80H616SE245L91", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791639730, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1275, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161791, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "edb5d1d6-d8de-4399-98b8-c0de0b841c0c", "db_session_id": "KYUCMB80H616SE245L91", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161791642107, "job": 1, "event": "recovery_finished"}
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Jan 23 09:49:51 compute-1 systemd[1]: Started libpod-conmon-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope.
Jan 23 09:49:51 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:51 compute-1 podman[78123]: 2026-01-23 09:49:51.739075591 +0000 UTC m=+3.845514178 container init 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2)
Jan 23 09:49:51 compute-1 podman[78123]: 2026-01-23 09:49:51.746608152 +0000 UTC m=+3.853046709 container start 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:49:51 compute-1 pensive_albattani[78141]: 167 167
Jan 23 09:49:51 compute-1 systemd[1]: libpod-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope: Deactivated successfully.
Jan 23 09:49:51 compute-1 conmon[78141]: conmon 8e52773d66f0d45762a8 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope/container/memory.events
Jan 23 09:49:51 compute-1 podman[78123]: 2026-01-23 09:49:51.77340427 +0000 UTC m=+3.879842937 container attach 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:49:51 compute-1 podman[78123]: 2026-01-23 09:49:51.774423436 +0000 UTC m=+3.880862003 container died 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55a55f912000
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: DB pointer 0x55a55f8bc000
Jan 23 09:49:51 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Jan 23 09:49:51 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super from 4, latest 4
Jan 23 09:49:51 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _upgrade_super done
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 09:49:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 4.3 total, 4.3 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.3 total, 4.3 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.3 total, 4.3 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.3 total, 4.3 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.3 total, 4.3 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.3 total, 4.3 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.2 total, 4.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.2 total, 4.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.2 total, 4.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.2 total, 4.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.2 total, 4.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.2 total, 4.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 4.2 total, 4.2 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 09:49:51 compute-1 ceph-osd[77616]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Jan 23 09:49:51 compute-1 ceph-osd[77616]: <cls> /home/jenkins-build/build/workspace/ceph-build/ARCH/x86_64/AVAILABLE_ARCH/x86_64/AVAILABLE_DIST/centos9/DIST/centos9/MACHINE_SIZE/gigantic/release/19.2.3/rpm/el9/BUILD/ceph-19.2.3/src/cls/hello/cls_hello.cc:316: loading cls_hello
Jan 23 09:49:51 compute-1 ceph-osd[77616]: _get_class not permitted to load lua
Jan 23 09:49:51 compute-1 ceph-osd[77616]: _get_class not permitted to load sdk
Jan 23 09:49:51 compute-1 ceph-osd[77616]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Jan 23 09:49:51 compute-1 ceph-osd[77616]: osd.0 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Jan 23 09:49:51 compute-1 ceph-osd[77616]: osd.0 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Jan 23 09:49:51 compute-1 ceph-osd[77616]: osd.0 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Jan 23 09:49:51 compute-1 ceph-osd[77616]: osd.0 0 load_pgs
Jan 23 09:49:51 compute-1 ceph-osd[77616]: osd.0 0 load_pgs opened 0 pgs
Jan 23 09:49:51 compute-1 ceph-osd[77616]: osd.0 0 log_to_monitors true
Jan 23 09:49:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0[77612]: 2026-01-23T09:49:51.826+0000 7f80f7b74740 -1 osd.0 0 log_to_monitors true
Jan 23 09:49:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-655734e702f595c2054344cff8329887765228ffdd492f8b3ccc4b124e06f8eb-merged.mount: Deactivated successfully.
Jan 23 09:49:51 compute-1 podman[78123]: 2026-01-23 09:49:51.903477169 +0000 UTC m=+4.009915776 container remove 8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=pensive_albattani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:49:51 compute-1 systemd[1]: libpod-conmon-8e52773d66f0d45762a82362629422291328714aca68f479d939256975a783a6.scope: Deactivated successfully.
Jan 23 09:49:52 compute-1 podman[78198]: 2026-01-23 09:49:52.080401251 +0000 UTC m=+0.071998027 container create 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:52 compute-1 systemd[1]: Started libpod-conmon-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope.
Jan 23 09:49:52 compute-1 podman[78198]: 2026-01-23 09:49:52.0316169 +0000 UTC m=+0.023213656 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:52 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:52 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:52 compute-1 podman[78198]: 2026-01-23 09:49:52.166286967 +0000 UTC m=+0.157883793 container init 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_REF=squid)
Jan 23 09:49:52 compute-1 podman[78198]: 2026-01-23 09:49:52.178898134 +0000 UTC m=+0.170494910 container start 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0)
Jan 23 09:49:52 compute-1 podman[78198]: 2026-01-23 09:49:52.2399259 +0000 UTC m=+0.231522636 container attach 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:52 compute-1 lvm[78288]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 09:49:52 compute-1 lvm[78288]: VG ceph_vg0 finished
Jan 23 09:49:52 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Jan 23 09:49:52 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Jan 23 09:49:52 compute-1 loving_knuth[78214]: {}
Jan 23 09:49:52 compute-1 systemd[1]: libpod-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope: Deactivated successfully.
Jan 23 09:49:52 compute-1 podman[78198]: 2026-01-23 09:49:52.88597129 +0000 UTC m=+0.877568026 container died 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:49:52 compute-1 systemd[1]: libpod-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope: Consumed 1.150s CPU time.
Jan 23 09:49:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-bd80e343285952f4707aa0e15410b33def7cc599eb9d053cec110c8339324791-merged.mount: Deactivated successfully.
Jan 23 09:49:52 compute-1 podman[78198]: 2026-01-23 09:49:52.932085047 +0000 UTC m=+0.923681773 container remove 9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=loving_knuth, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:49:52 compute-1 ceph-osd[77616]: osd.0 0 done with init, starting boot process
Jan 23 09:49:52 compute-1 ceph-osd[77616]: osd.0 0 start_boot
Jan 23 09:49:52 compute-1 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_max_backfills set to 1
Jan 23 09:49:52 compute-1 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Jan 23 09:49:52 compute-1 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Jan 23 09:49:52 compute-1 ceph-osd[77616]: osd.0 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Jan 23 09:49:52 compute-1 ceph-osd[77616]: osd.0 0  bench count 12288000 bsize 4 KiB
Jan 23 09:49:52 compute-1 systemd[1]: libpod-conmon-9e9e2d0acf895114575a083a9ba6682d79eba6bee9caee3d1b85ddef2d3ed4b7.scope: Deactivated successfully.
Jan 23 09:49:52 compute-1 sudo[77877]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:53 compute-1 sudo[78304]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:49:53 compute-1 sudo[78304]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:53 compute-1 sudo[78304]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:53 compute-1 sudo[78329]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:53 compute-1 sudo[78329]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:53 compute-1 sudo[78329]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:53 compute-1 sudo[78354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:49:53 compute-1 sudo[78354]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:54 compute-1 podman[78450]: 2026-01-23 09:49:54.262455525 +0000 UTC m=+0.408605672 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:49:54 compute-1 podman[78470]: 2026-01-23 09:49:54.490716957 +0000 UTC m=+0.110803692 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid)
Jan 23 09:49:54 compute-1 podman[78450]: 2026-01-23 09:49:54.62933951 +0000 UTC m=+0.775489697 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS)
Jan 23 09:49:55 compute-1 sudo[78354]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-1 sudo[78499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:55 compute-1 sudo[78499]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:55 compute-1 sudo[78499]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-1 sudo[78524]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:49:55 compute-1 sudo[78524]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:55 compute-1 sudo[78524]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-1 sudo[78580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:49:55 compute-1 sudo[78580]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:55 compute-1 sudo[78580]: pam_unix(sudo:session): session closed for user root
Jan 23 09:49:55 compute-1 sudo[78605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- inventory --format=json-pretty --filter-for-batch
Jan 23 09:49:55 compute-1 sudo[78605]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:49:56 compute-1 podman[78671]: 2026-01-23 09:49:56.386100666 +0000 UTC m=+0.117839785 container create 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 09:49:56 compute-1 podman[78671]: 2026-01-23 09:49:56.296702838 +0000 UTC m=+0.028441987 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:56 compute-1 systemd[1]: Started libpod-conmon-3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a.scope.
Jan 23 09:49:56 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:56 compute-1 podman[78671]: 2026-01-23 09:49:56.898996691 +0000 UTC m=+0.630735810 container init 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 09:49:56 compute-1 podman[78671]: 2026-01-23 09:49:56.905699503 +0000 UTC m=+0.637438642 container start 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True)
Jan 23 09:49:56 compute-1 modest_murdock[78687]: 167 167
Jan 23 09:49:56 compute-1 systemd[1]: libpod-3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a.scope: Deactivated successfully.
Jan 23 09:49:57 compute-1 podman[78671]: 2026-01-23 09:49:57.046175891 +0000 UTC m=+0.777915020 container attach 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 09:49:57 compute-1 podman[78671]: 2026-01-23 09:49:57.047186466 +0000 UTC m=+0.778925575 container died 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:49:57 compute-1 systemd[1]: var-lib-containers-storage-overlay-f9aca6e6255188e68cd156b8f3472d2ad8076950b3211f23582ffdbcd86fd0c1-merged.mount: Deactivated successfully.
Jan 23 09:49:57 compute-1 podman[78671]: 2026-01-23 09:49:57.831485018 +0000 UTC m=+1.563224137 container remove 3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=modest_murdock, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:49:57 compute-1 systemd[1]: libpod-conmon-3718d69e5daa0d7d8bb2bc351c52abdee44ae6b3cd539d2cca193410b1a9750a.scope: Deactivated successfully.
Jan 23 09:49:58 compute-1 podman[78712]: 2026-01-23 09:49:58.03610559 +0000 UTC m=+0.045973403 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:49:58 compute-1 podman[78712]: 2026-01-23 09:49:58.214743492 +0000 UTC m=+0.224611265 container create d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:49:58 compute-1 systemd[1]: Started libpod-conmon-d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310.scope.
Jan 23 09:49:58 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:49:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:58 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:49:58 compute-1 podman[78712]: 2026-01-23 09:49:58.704849877 +0000 UTC m=+0.714717670 container init d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 09:49:58 compute-1 podman[78712]: 2026-01-23 09:49:58.712874056 +0000 UTC m=+0.722741859 container start d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 09:49:58 compute-1 podman[78712]: 2026-01-23 09:49:58.931813804 +0000 UTC m=+0.941681657 container attach d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:49:59 compute-1 condescending_jang[78728]: [
Jan 23 09:49:59 compute-1 condescending_jang[78728]:     {
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "available": false,
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "being_replaced": false,
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "ceph_device_lvm": false,
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "lsm_data": {},
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "lvs": [],
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "path": "/dev/sr0",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "rejected_reasons": [
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "Has a FileSystem",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "Insufficient space (<5GB)"
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         ],
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         "sys_api": {
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "actuators": null,
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "device_nodes": [
Jan 23 09:49:59 compute-1 condescending_jang[78728]:                 "sr0"
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             ],
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "devname": "sr0",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "human_readable_size": "482.00 KB",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "id_bus": "ata",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "model": "QEMU DVD-ROM",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "nr_requests": "2",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "parent": "/dev/sr0",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "partitions": {},
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "path": "/dev/sr0",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "removable": "1",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "rev": "2.5+",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "ro": "0",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "rotational": "1",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "sas_address": "",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "sas_device_handle": "",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "scheduler_mode": "mq-deadline",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "sectors": 0,
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "sectorsize": "2048",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "size": 493568.0,
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "support_discard": "2048",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "type": "disk",
Jan 23 09:49:59 compute-1 condescending_jang[78728]:             "vendor": "QEMU"
Jan 23 09:49:59 compute-1 condescending_jang[78728]:         }
Jan 23 09:49:59 compute-1 condescending_jang[78728]:     }
Jan 23 09:49:59 compute-1 condescending_jang[78728]: ]
Jan 23 09:49:59 compute-1 systemd[1]: libpod-d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310.scope: Deactivated successfully.
Jan 23 09:49:59 compute-1 podman[78712]: 2026-01-23 09:49:59.439328583 +0000 UTC m=+1.449196376 container died d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:49:59 compute-1 systemd[1]: var-lib-containers-storage-overlay-8757a2531f36f34e9006b899bd5023f7bb3d978eab268c4302422e7879e271bb-merged.mount: Deactivated successfully.
Jan 23 09:49:59 compute-1 podman[78712]: 2026-01-23 09:49:59.909778887 +0000 UTC m=+1.919646660 container remove d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=condescending_jang, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:49:59 compute-1 sudo[78605]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:00 compute-1 systemd[1]: libpod-conmon-d08028405d724364e21afe8d69e1e25c42be3f8d187898ebaeed5d825ef58310.scope: Deactivated successfully.
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 13.396 iops: 3429.483 elapsed_sec: 0.875
Jan 23 09:50:03 compute-1 ceph-osd[77616]: log_channel(cluster) log [WRN] : OSD bench result of 3429.482546 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.0. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 0 waiting for initial osdmap
Jan 23 09:50:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0[77612]: 2026-01-23T09:50:03.888+0000 7f80f430a640 -1 osd.0 0 waiting for initial osdmap
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 16 crush map has features 288514051259236352, adjusting msgr requires for clients
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 16 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 16 crush map has features 3314933000852226048, adjusting msgr requires for osds
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 16 check_osdmap_features require_osd_release unknown -> squid
Jan 23 09:50:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-osd-0[77612]: 2026-01-23T09:50:03.929+0000 7f80ef11f640 -1 osd.0 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 16 set_numa_affinity not setting numa affinity
Jan 23 09:50:03 compute-1 ceph-osd[77616]: osd.0 16 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial no unique device path for loop3: no symlink to loop3 in /dev/disk/by-path
Jan 23 09:50:04 compute-1 ceph-osd[77616]: osd.0 17 state: booting -> active
Jan 23 09:50:04 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 17 pg[1.0( empty local-lis/les=0/0 n=0 ec=11/11 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[11,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:04 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 17 pg[2.0( empty local-lis/les=0/0 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[13,17)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:05 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 18 pg[7.0( empty local-lis/les=0/0 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:05 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 18 pg[2.0( empty local-lis/les=17/18 n=0 ec=13/13 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[13,17)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:05 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 18 pg[1.0( empty local-lis/les=17/18 n=0 ec=11/11 lis/c=0/0 les/c/f=0/0/0 sis=17) [0] r=0 lpr=17 pi=[11,17)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:06 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 19 pg[7.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=0/0 les/c/f=0/0/0 sis=18) [0] r=0 lpr=18 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 24 pg[2.0( empty local-lis/les=17/18 n=0 ec=13/13 lis/c=17/17 les/c/f=18/18/0 sis=24 pruub=9.220177650s) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active pruub 30.096221924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 24 pg[2.0( empty local-lis/les=17/18 n=0 ec=13/13 lis/c=17/17 les/c/f=18/18/0 sis=24 pruub=9.220177650s) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown pruub 30.096221924s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1f( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1c( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1d( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1b( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1e( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.8( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.7( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.9( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.a( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.6( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.4( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.2( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.5( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.3( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.b( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.c( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.e( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.d( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.f( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.10( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.11( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.12( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.13( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.14( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.15( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.17( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.18( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.19( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.16( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1a( empty local-lis/les=17/18 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.7( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.8( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.9( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.4( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.6( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.2( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.3( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.0( empty local-lis/les=24/25 n=0 ec=13/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.11( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.14( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.13( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.15( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.17( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.10( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.19( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.1a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 25 pg[2.16( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=17/17 les/c/f=18/18/0 sis=24) [0] r=0 lpr=24 pi=[17,24)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:13 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1e scrub starts
Jan 23 09:50:13 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1e scrub ok
Jan 23 09:50:14 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1d scrub starts
Jan 23 09:50:14 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1d scrub ok
Jan 23 09:50:15 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1f scrub starts
Jan 23 09:50:15 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1f scrub ok
Jan 23 09:50:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 27 pg[7.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=27 pruub=14.500118256s) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active pruub 38.990123749s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 27 pg[7.0( empty local-lis/les=18/19 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=27 pruub=14.500118256s) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown pruub 38.990123749s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:16 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1c scrub starts
Jan 23 09:50:16 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1c scrub ok
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1f( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1d( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1c( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.12( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.13( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.11( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.10( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.16( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.17( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.14( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.15( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.a( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.8( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.9( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.b( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.e( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.6( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.5( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.4( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.7( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.3( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.2( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.d( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.c( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.f( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1e( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.18( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.19( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1b( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1a( empty local-lis/les=18/19 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.12( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1c( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.10( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.14( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.17( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.8( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.13( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.9( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.0( empty local-lis/les=27/28 n=0 ec=18/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.6( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.4( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.7( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.3( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.2( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.c( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.18( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.19( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.1a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 28 pg[7.15( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=18/18 les/c/f=19/19/0 sis=27) [0] r=0 lpr=27 pi=[18,27)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:17 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.7 deep-scrub starts
Jan 23 09:50:17 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.7 deep-scrub ok
Jan 23 09:50:19 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.9 scrub starts
Jan 23 09:50:19 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.9 scrub ok
Jan 23 09:50:20 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Jan 23 09:50:20 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Jan 23 09:50:21 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.6 scrub starts
Jan 23 09:50:21 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.6 scrub ok
Jan 23 09:50:22 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.a scrub starts
Jan 23 09:50:22 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.a scrub ok
Jan 23 09:50:23 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Jan 23 09:50:23 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Jan 23 09:50:24 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.0 deep-scrub starts
Jan 23 09:50:24 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.0 deep-scrub ok
Jan 23 09:50:25 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.4 scrub starts
Jan 23 09:50:25 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.4 scrub ok
Jan 23 09:50:26 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1 scrub starts
Jan 23 09:50:26 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1 scrub ok
Jan 23 09:50:27 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Jan 23 09:50:27 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.18( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.1a( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.18( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.19( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1a( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.1c( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1b( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1b( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.1d( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1a( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1c( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.1a( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.c( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.e( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.e( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.d( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.e( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.f( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.9( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.3( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.2( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.5( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.3( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.5( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.2( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.7( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.5( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.4( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.7( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.d( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.a( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.a( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.8( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.d( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.c( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.9( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.8( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.a( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.f( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.e( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.9( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.11( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.16( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.10( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.15( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.13( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.15( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.15( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.17( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.15( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.13( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.14( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.11( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.12( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[3.16( empty local-lis/les=0/0 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.10( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[5.1f( empty local-lis/les=0/0 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.1c( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[4.1f( empty local-lis/les=0/0 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[6.1e( empty local-lis/les=0/0 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.19( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569741249s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228610992s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.19( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569710732s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228610992s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927922249s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.586837769s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.15( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569553375s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228507996s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1d( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927888870s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.586837769s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.15( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569536209s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228507996s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.10( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927852631s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.586887360s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.13( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927933693s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.586959839s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.10( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927838326s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.586887360s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.13( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927897453s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.586959839s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.13( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569301605s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228446960s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.13( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569286346s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228446960s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.14( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927904129s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587131500s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.14( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927886963s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587131500s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.10( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569243431s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228557587s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927868843s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587200165s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.a( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927852631s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587200165s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568997383s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228363037s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.10( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.569215775s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228557587s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568982124s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228363037s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927917480s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587375641s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568883896s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228378296s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927903175s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587375641s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568869591s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228378296s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.8( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927607536s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587207794s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568799019s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228431702s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568782806s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228431702s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.8( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927577972s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587207794s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.9( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927671432s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587364197s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.9( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927655220s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587364197s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927598953s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587387085s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927581787s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587387085s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.6( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927831650s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587646484s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.6( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927813530s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587646484s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.4( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927702904s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587638855s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568323135s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228282928s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568302155s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228282928s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.4( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568067551s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228092194s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.6( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568223000s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228286743s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.4( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568034172s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228092194s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.6( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.568204880s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228286743s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.3( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927593231s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587711334s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.3( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927577019s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587711334s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.2( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927405357s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587703705s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567815781s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228160858s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.9( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567931175s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.228275299s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.a( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567797661s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228160858s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.9( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567910194s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.228275299s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.2( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927258492s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587703705s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927349091s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587882996s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1e( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927308083s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587882996s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927147865s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587757111s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567054749s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.227695465s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.567035675s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.227695465s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927106857s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587757111s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927084923s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587886810s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566875458s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.227695465s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.1b( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.927068710s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587886810s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.4( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.926847458s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587638855s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1e( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566858292s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.227695465s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566905975s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active pruub 45.227867126s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[2.1f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=30 pruub=9.566858292s) [1] r=-1 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 45.227867126s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.18( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.926668167s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 active pruub 49.587890625s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:50:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 30 pg[7.18( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=30 pruub=13.926606178s) [1] r=-1 lpr=30 pi=[27,30)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 49.587890625s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:50:28 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1f scrub starts
Jan 23 09:50:28 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1f scrub ok
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.18( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.19( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.18( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.1a( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.1c( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1b( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1b( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1c( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1a( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.d( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.c( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.e( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.f( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.3( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.2( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.5( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.2( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.5( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.3( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.5( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.7( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.7( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.8( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.d( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.a( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.c( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.9( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.a( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.15( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.10( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.13( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.f( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.16( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.15( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.13( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.11( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.14( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.1f( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[3.16( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[5.10( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=24/24 les/c/f=25/25/0 sis=30) [0] r=0 lpr=30 pi=[24,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:28 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 31 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=26/26 les/c/f=29/29/0 sis=30) [0] r=0 lpr=30 pi=[26,30)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:50:29 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1c scrub starts
Jan 23 09:50:29 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1c scrub ok
Jan 23 09:50:30 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Jan 23 09:50:30 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Jan 23 09:50:31 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Jan 23 09:50:31 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Jan 23 09:50:32 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.12 scrub starts
Jan 23 09:50:32 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.12 scrub ok
Jan 23 09:50:33 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Jan 23 09:50:33 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Jan 23 09:50:34 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.11 deep-scrub starts
Jan 23 09:50:34 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.11 deep-scrub ok
Jan 23 09:50:35 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Jan 23 09:50:35 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Jan 23 09:50:36 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.12 scrub starts
Jan 23 09:50:36 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.12 scrub ok
Jan 23 09:50:37 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.17 deep-scrub starts
Jan 23 09:50:37 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.17 deep-scrub ok
Jan 23 09:50:38 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.11 scrub starts
Jan 23 09:50:38 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.11 scrub ok
Jan 23 09:50:39 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.15 scrub starts
Jan 23 09:50:39 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.15 scrub ok
Jan 23 09:50:39 compute-1 sudo[79781]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxhscbwqnmpqpwpvgtiaqoewyaxfgwzq ; /usr/bin/python3'
Jan 23 09:50:39 compute-1 sudo[79781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:50:39 compute-1 python3[79783]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a -f 'name=ceph-?(.*)-mgr.*' --format \{\{\.Command\}\} --no-trunc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:50:39 compute-1 sudo[79781]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:40 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.f scrub starts
Jan 23 09:50:40 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.f scrub ok
Jan 23 09:50:41 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.b scrub starts
Jan 23 09:50:41 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.b scrub ok
Jan 23 09:50:42 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.16 scrub starts
Jan 23 09:50:42 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.16 scrub ok
Jan 23 09:50:43 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.5 deep-scrub starts
Jan 23 09:50:43 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.5 deep-scrub ok
Jan 23 09:50:43 compute-1 sshd-session[79797]: Invalid user sol from 45.148.10.240 port 47548
Jan 23 09:50:44 compute-1 sshd-session[79797]: Connection closed by invalid user sol 45.148.10.240 port 47548 [preauth]
Jan 23 09:50:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Jan 23 09:50:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Jan 23 09:50:45 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.5 scrub starts
Jan 23 09:50:45 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.5 scrub ok
Jan 23 09:50:45 compute-1 sudo[79799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:50:45 compute-1 sudo[79799]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:45 compute-1 sudo[79799]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:45 compute-1 sudo[79824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:45 compute-1 sudo[79824]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:46 compute-1 podman[79888]: 2026-01-23 09:50:46.112578396 +0000 UTC m=+0.038624331 container create 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 09:50:46 compute-1 systemd[72579]: Starting Mark boot as successful...
Jan 23 09:50:46 compute-1 systemd[1]: Started libpod-conmon-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope.
Jan 23 09:50:46 compute-1 systemd[72579]: Finished Mark boot as successful.
Jan 23 09:50:46 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:50:46 compute-1 podman[79888]: 2026-01-23 09:50:46.172433799 +0000 UTC m=+0.098479774 container init 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:50:46 compute-1 podman[79888]: 2026-01-23 09:50:46.180005861 +0000 UTC m=+0.106051796 container start 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 09:50:46 compute-1 podman[79888]: 2026-01-23 09:50:46.183467392 +0000 UTC m=+0.109513387 container attach 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:50:46 compute-1 determined_cray[79905]: 167 167
Jan 23 09:50:46 compute-1 systemd[1]: libpod-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope: Deactivated successfully.
Jan 23 09:50:46 compute-1 conmon[79905]: conmon 61ed6a2a4e3de4ff6fb9 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope/container/memory.events
Jan 23 09:50:46 compute-1 podman[79888]: 2026-01-23 09:50:46.186915621 +0000 UTC m=+0.112961576 container died 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325)
Jan 23 09:50:46 compute-1 podman[79888]: 2026-01-23 09:50:46.096222758 +0000 UTC m=+0.022268713 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:46 compute-1 systemd[1]: var-lib-containers-storage-overlay-ba6b10ec9127ad71c246b54227d169f244d1e8b67c2aa1dd44fe8281e0da0aba-merged.mount: Deactivated successfully.
Jan 23 09:50:46 compute-1 podman[79888]: 2026-01-23 09:50:46.225187748 +0000 UTC m=+0.151233703 container remove 61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=determined_cray, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, CEPH_REF=squid)
Jan 23 09:50:46 compute-1 systemd[1]: libpod-conmon-61ed6a2a4e3de4ff6fb9d0d4b18e26215a2f685ee6d461e81e1cb931d1f20d20.scope: Deactivated successfully.
Jan 23 09:50:46 compute-1 podman[79921]: 2026-01-23 09:50:46.30084892 +0000 UTC m=+0.052746809 container create ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:50:46 compute-1 systemd[1]: Started libpod-conmon-ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f.scope.
Jan 23 09:50:46 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:50:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:46 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:46 compute-1 podman[79921]: 2026-01-23 09:50:46.275401678 +0000 UTC m=+0.027299657 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:46 compute-1 podman[79921]: 2026-01-23 09:50:46.386468078 +0000 UTC m=+0.138366057 container init ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid)
Jan 23 09:50:46 compute-1 podman[79921]: 2026-01-23 09:50:46.396051179 +0000 UTC m=+0.147949078 container start ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1)
Jan 23 09:50:46 compute-1 podman[79921]: 2026-01-23 09:50:46.400223394 +0000 UTC m=+0.152121323 container attach ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:50:46 compute-1 systemd[1]: libpod-ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f.scope: Deactivated successfully.
Jan 23 09:50:46 compute-1 podman[79921]: 2026-01-23 09:50:46.488765743 +0000 UTC m=+0.240663742 container died ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:50:46 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.7 scrub starts
Jan 23 09:50:46 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.7 scrub ok
Jan 23 09:50:46 compute-1 systemd[1]: var-lib-containers-storage-overlay-9479001acd07a3d01ba4d3d6828adbd7b3caf1edcb6e74b2ce9c75c211356323-merged.mount: Deactivated successfully.
Jan 23 09:50:46 compute-1 podman[79921]: 2026-01-23 09:50:46.540752215 +0000 UTC m=+0.292650104 container remove ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=upbeat_johnson, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default)
Jan 23 09:50:46 compute-1 systemd[1]: libpod-conmon-ba9d0f2b4d3f4bcc7db9067bcc56da71acee5e8fb673e22224e528ab90f4255f.scope: Deactivated successfully.
Jan 23 09:50:46 compute-1 systemd[1]: Reloading.
Jan 23 09:50:46 compute-1 systemd-sysv-generator[80006]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:46 compute-1 systemd-rc-local-generator[80001]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:46 compute-1 systemd[1]: Reloading.
Jan 23 09:50:46 compute-1 systemd-rc-local-generator[80048]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:46 compute-1 systemd-sysv-generator[80051]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:47 compute-1 systemd[1]: Starting Ceph mon.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:50:47 compute-1 podman[80106]: 2026-01-23 09:50:47.306073598 +0000 UTC m=+0.043630782 container create c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-1, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:50:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:47 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e7173b5235c451bebb9606b15241deab18031289ccd1627f117cd452a9f7db/merged/var/lib/ceph/mon/ceph-compute-1 supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:47 compute-1 podman[80106]: 2026-01-23 09:50:47.368408119 +0000 UTC m=+0.105965313 container init c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:50:47 compute-1 podman[80106]: 2026-01-23 09:50:47.375588147 +0000 UTC m=+0.113145321 container start c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mon-compute-1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:50:47 compute-1 bash[80106]: c1579b7599b2e571635ed3fae7a7ef35d7f3ef624019dac27d31923c7bd1f747
Jan 23 09:50:47 compute-1 podman[80106]: 2026-01-23 09:50:47.286600924 +0000 UTC m=+0.024158138 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:47 compute-1 systemd[1]: Started Ceph mon.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:50:47 compute-1 ceph-mon[80126]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mon, pid 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pidfile_write: ignore empty --pid-file
Jan 23 09:50:47 compute-1 ceph-mon[80126]: load: jerasure load: lrc 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: RocksDB version: 7.9.2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Git sha 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Compile date 2025-07-17 03:12:14
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: DB SUMMARY
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: DB Session ID:  PH7FUS34ITA44089QBF9
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: CURRENT file:  CURRENT
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: IDENTITY file:  IDENTITY
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: SST files in /var/lib/ceph/mon/ceph-compute-1/store.db dir, Total Num: 0, files: 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-compute-1/store.db: 000004.log size: 636 ; 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                         Options.error_if_exists: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                       Options.create_if_missing: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                         Options.paranoid_checks: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.flush_verify_memtable_count: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                                     Options.env: 0x563e77fecc20
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                                      Options.fs: PosixFileSystem
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                                Options.info_log: 0x563e79a7ba20
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.max_file_opening_threads: 16
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                              Options.statistics: (nil)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                               Options.use_fsync: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                       Options.max_log_file_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.log_file_time_to_roll: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                       Options.keep_log_file_num: 1000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                    Options.recycle_log_file_num: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                         Options.allow_fallocate: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                        Options.allow_mmap_reads: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                       Options.allow_mmap_writes: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                        Options.use_direct_reads: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:          Options.create_missing_column_families: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                              Options.db_log_dir: 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                                 Options.wal_dir: 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.table_cache_numshardbits: 6
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                         Options.WAL_ttl_seconds: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                       Options.WAL_size_limit_MB: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.manifest_preallocation_size: 4194304
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                     Options.is_fd_close_on_exec: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.advise_random_on_open: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                    Options.db_write_buffer_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                    Options.write_buffer_manager: 0x563e79a7f900
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.access_hint_on_compaction_start: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                      Options.use_adaptive_mutex: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                            Options.rate_limiter: (nil)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                       Options.wal_recovery_mode: 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.enable_thread_tracking: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.enable_pipelined_write: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.unordered_write: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.write_thread_max_yield_usec: 100
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                               Options.row_cache: None
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                              Options.wal_filter: None
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.avoid_flush_during_recovery: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.allow_ingest_behind: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.two_write_queues: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.manual_wal_flush: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.wal_compression: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.atomic_flush: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.persist_stats_to_disk: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.write_dbid_to_manifest: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.log_readahead_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.best_efforts_recovery: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.allow_data_in_errors: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.db_host_id: __hostname__
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.enforce_single_del_contracts: true
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.max_background_jobs: 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.max_background_compactions: -1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.max_subcompactions: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.delayed_write_rate : 16777216
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.max_total_wal_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.stats_dump_period_sec: 600
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.stats_persist_period_sec: 600
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                          Options.max_open_files: -1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                          Options.bytes_per_sync: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                      Options.wal_bytes_per_sync: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.strict_bytes_per_sync: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:       Options.compaction_readahead_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.max_background_flushes: -1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Compression algorithms supported:
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kZSTD supported: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kXpressCompression supported: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kBZip2Compression supported: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kZSTDNotFinalCompression supported: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kLZ4Compression supported: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kZlibCompression supported: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kLZ4HCCompression supported: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         kSnappyCompression supported: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Fast CRC32 supported: Supported on x86
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: DMutex implementation: pthread_mutex_t
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:           Options.merge_operator: 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:        Options.compaction_filter: None
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:        Options.compaction_filter_factory: None
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:  Options.sst_partitioner_factory: None
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.memtable_factory: SkipListFactory
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:            Options.table_factory: BlockBasedTable
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x563e79a7a5c0)
                                             cache_index_and_filter_blocks: 1
                                             cache_index_and_filter_blocks_with_high_priority: 0
                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                             pin_top_level_index_and_filter: 1
                                             index_type: 0
                                             data_block_index_type: 0
                                             index_shortening: 1
                                             data_block_hash_table_util_ratio: 0.750000
                                             checksum: 4
                                             no_block_cache: 0
                                             block_cache: 0x563e79a9f350
                                             block_cache_name: BinnedLRUCache
                                             block_cache_options:
                                               capacity : 536870912
                                               num_shard_bits : 4
                                               strict_capacity_limit : 0
                                               high_pri_pool_ratio: 0.000
                                             block_cache_compressed: (nil)
                                             persistent_cache: (nil)
                                             block_size: 4096
                                             block_size_deviation: 10
                                             block_restart_interval: 16
                                             index_block_restart_interval: 1
                                             metadata_block_size: 4096
                                             partition_filters: 0
                                             use_delta_encoding: 1
                                             filter_policy: bloomfilter
                                             whole_key_filtering: 1
                                             verify_compression: 0
                                             read_amp_bytes_per_bit: 0
                                             format_version: 5
                                             enable_index_compression: 1
                                             block_align: 0
                                             max_auto_readahead_size: 262144
                                             prepopulate_block_cache: 0
                                             initial_auto_readahead_size: 8192
                                             num_file_reads_for_auto_readahead: 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:        Options.write_buffer_size: 33554432
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:  Options.max_write_buffer_number: 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:          Options.compression: NoCompression
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.bottommost_compression: Disabled
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:       Options.prefix_extractor: nullptr
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.num_levels: 7
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:            Options.compression_opts.window_bits: -14
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.compression_opts.level: 32767
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:               Options.compression_opts.strategy: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.compression_opts.parallel_threads: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                  Options.compression_opts.enabled: false
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:              Options.level0_stop_writes_trigger: 36
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.target_file_size_base: 67108864
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:             Options.target_file_size_multiplier: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                        Options.arena_block_size: 1048576
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.disable_auto_compactions: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.inplace_update_support: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                 Options.inplace_update_num_locks: 10000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:               Options.memtable_whole_key_filtering: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:   Options.memtable_huge_page_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                           Options.bloom_locality: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                    Options.max_successive_merges: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.optimize_filters_for_hits: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.paranoid_file_checks: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.force_consistency_checks: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.report_bg_io_stats: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                               Options.ttl: 2592000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:          Options.periodic_compaction_seconds: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:    Options.preserve_internal_time_seconds: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                       Options.enable_blob_files: false
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                           Options.min_blob_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                          Options.blob_file_size: 268435456
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                   Options.blob_compression_type: NoCompression
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:          Options.enable_blob_garbage_collection: false
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:          Options.blob_compaction_readahead_size: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb:                Options.blob_file_starting_level: 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-compute-1/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 1897ab4a-12ed-4850-8782-7d536e06cd96
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161847414070, "job": 1, "event": "recovery_started", "wal_files": [4]}
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161847416344, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1773, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769161847416441, "job": 1, "event": "recovery_finished"}
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Jan 23 09:50:47 compute-1 sudo[79824]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x563e79aa0e00
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: DB pointer 0x563e79baa000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 09:50:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.73 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      1/0    1.73 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 0.0 total, 0.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.12 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.3e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1 does not exist in monmap, will attempt to join an existing cluster
Jan 23 09:50:47 compute-1 ceph-mon[80126]: using public_addr v2:192.168.122.101:0/0 -> [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0]
Jan 23 09:50:47 compute-1 ceph-mon[80126]: starting mon.compute-1 rank -1 at public addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] at bind addrs [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon_data /var/lib/ceph/mon/ceph-compute-1 fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(???) e0 preinit fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).mds e1 new map
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).mds e1 print_map
                                           e1
                                           btime 2026-01-23T09:47:38:565964+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: -1
                                            
                                           No filesystems configured
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e4 e4: 1 total, 0 up, 1 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e5 e5: 2 total, 0 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e6 e6: 2 total, 0 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e7 e7: 2 total, 0 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e8 e8: 2 total, 0 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e9 e9: 2 total, 0 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e10 e10: 2 total, 1 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e11 e11: 2 total, 1 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e12 e12: 2 total, 1 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e13 e13: 2 total, 1 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e14 e14: 2 total, 1 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e15 e15: 2 total, 1 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e16 e16: 2 total, 1 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e17 e17: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e18 e18: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e19 e19: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e20 e20: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e21 e21: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e22 e22: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e23 e23: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e24 e24: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e25 e25: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e26 e26: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e27 e27: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e28 e28: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e29 e29: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e30 e30: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 e31: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 3314933000852226048, adjusting msgr requires
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).osd e31 crush map has features 288514051259236352, adjusting msgr requires
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1144026165' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "backups", "app": "rbd"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e21: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v74: 7 pgs: 4 active+clean, 2 creating+peering, 1 unknown; 0 B data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e22: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1803776421' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "images", "app": "rbd"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e23: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mgrmap e9: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v77: 7 pgs: 7 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e24: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Health check update: 2 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2193766018' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.meta", "app": "cephfs"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e25: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1f scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1f scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd=[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2528169956' entity='client.admin' cmd='[{"prefix": "osd pool application enable", "pool": "cephfs.cephfs.data", "app": "cephfs"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e26: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v80: 100 pgs: 2 peering, 93 unknown, 5 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1e scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1e scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.19 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.19 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1d scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1d scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.10 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.10 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v82: 162 pgs: 2 peering, 124 unknown, 36 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1f scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1f scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e27: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1e deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1e deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1c scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1c scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e28: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/29302298' entity='client.admin' cmd='[{"prefix": "config assimilate-conf"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.18 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.18 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v85: 193 pgs: 93 unknown, 100 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.7 deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.7 deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.17 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.17 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.9 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.9 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e29: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.16 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v87: 193 pgs: 93 unknown, 100 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.8 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.8 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.6 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.6 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.16 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.11 deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v88: 193 pgs: 62 unknown, 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.a scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.a scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.11 deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.15 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2695482257' entity='client.admin' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.2 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.2 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.15 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.12 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v89: 193 pgs: 62 unknown, 131 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.12 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.0 deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.0 deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.14225 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Saving service ingress.rgw.default spec with placement count:2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.11 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.11 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.4 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.4 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.14 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.14 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v90: 193 pgs: 1 active, 1 active+clean+scrubbing, 191 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.1 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.12 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.12 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.3 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.3 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "backups", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.data", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "cephfs.cephfs.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "images", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "vms", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "volumes", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e30: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.1f scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.1f scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e31: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v92: 193 pgs: 1 active, 1 active+clean+scrubbing, 191 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.1f scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.1f scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.14227 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Saving service node-exporter spec with placement *
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.19 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.1c scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.1c scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.19 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Saving service grafana spec with placement compute-0;count:1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v94: 193 pgs: 1 active, 192 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.1e scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.1e scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Saving service prometheus spec with placement compute-0;count:1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Saving service alertmanager spec with placement compute-0;count:1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.18 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.18 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.1b scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.1b scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.17 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.17 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.18 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.18 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1143624271' entity='client.admin' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v95: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.12 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.12 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 5.19 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 5.19 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.16 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.16 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v96: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.1b scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.1b scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.11 deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.11 deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3906855381' entity='client.admin' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1c scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1c scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.14 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.14 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 5.1d scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 5.1d scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v97: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.12 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.12 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2854364725' entity='client.admin' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.1f deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.1f deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.17 deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.17 deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v98: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Deploying daemon mon.compute-2 on compute-2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1d scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.1d scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.11 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.11 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Health check cleared: CEPHADM_APPLY_SPEC_FAIL (was: Failed to apply 2 service(s): mon,mgr)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Cluster is now healthy
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2852887520' entity='client.admin' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.c scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.c scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v99: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.f scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.f scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-0 calling monitor election
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.f scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.f scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.b scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.b scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v100: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.3 deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 4.3 deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.16 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.16 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-2 calling monitor election
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.4 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 3.4 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.5 deep-scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.5 deep-scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v101: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.1 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.1 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.0 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 7.0 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.6 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 6.6 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: pgmap v102: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-0 is new leader, mons compute-0,compute-2 in quorum (ranks 0,1)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: monmap epoch 2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:47 compute-1 ceph-mon[80126]: last_changed 2026-01-23T09:50:40.551249+0000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: created 2026-01-23T09:47:35.499222+0000
Jan 23 09:50:47 compute-1 ceph-mon[80126]: min_mon_release 19 (squid)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: election_strategy: 1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Jan 23 09:50:47 compute-1 ceph-mon[80126]: fsmap 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: osdmap e31: 2 total, 2 up, 2 in
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mgrmap e9: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:47 compute-1 ceph-mon[80126]: overall HEALTH_OK
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.5 scrub starts
Jan 23 09:50:47 compute-1 ceph-mon[80126]: 2.5 scrub ok
Jan 23 09:50:47 compute-1 ceph-mon[80126]: Deploying daemon mon.compute-1 on compute-1
Jan 23 09:50:47 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:47 compute-1 ceph-mon[80126]: mon.compute-1@-1(synchronizing).paxosservice(auth 1..7) refresh upgraded, format 0 -> 3
Jan 23 09:50:47 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Jan 23 09:50:47 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Jan 23 09:50:48 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.d scrub starts
Jan 23 09:50:48 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.d scrub ok
Jan 23 09:50:49 compute-1 ceph-mon[80126]: mon.compute-1@-1(probing) e3  my rank is now 2 (was -1)
Jan 23 09:50:49 compute-1 ceph-mon[80126]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 23 09:50:49 compute-1 ceph-mon[80126]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Jan 23 09:50:49 compute-1 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:49 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.c scrub starts
Jan 23 09:50:49 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.c scrub ok
Jan 23 09:50:50 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1a deep-scrub starts
Jan 23 09:50:50 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 2.1a deep-scrub ok
Jan 23 09:50:51 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.19 scrub starts
Jan 23 09:50:51 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.19 scrub ok
Jan 23 09:50:52 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1a scrub starts
Jan 23 09:50:52 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 7.1a scrub ok
Jan 23 09:50:52 compute-1 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Jan 23 09:50:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout,16=squid ondisk layout}
Jan 23 09:50:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 09:50:52 compute-1 ceph-mon[80126]: mgrc update_daemon_metadata mon.compute-1 metadata {addrs=[v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0],arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=centos,distro_description=CentOS Stream 9,distro_version=9,hostname=compute-1,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,os=Linux}
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 3.2 scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 3.2 scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: mon.compute-0 calling monitor election
Jan 23 09:50:53 compute-1 ceph-mon[80126]: mon.compute-2 calling monitor election
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 4.4 scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 4.4 scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.d scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.d scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 4.6 scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 4.6 scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: pgmap v104: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: mon.compute-1 calling monitor election
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.c scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.c scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 3.1 scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 3.1 scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 2.1a deep-scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 2.1a deep-scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 6.4 deep-scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 6.4 deep-scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.19 scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.19 scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: pgmap v105: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 4.2 scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 4.2 scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.1a scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 7.1a scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 6.0 scrub starts
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 6.0 scrub ok
Jan 23 09:50:53 compute-1 ceph-mon[80126]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 09:50:53 compute-1 ceph-mon[80126]: monmap epoch 3
Jan 23 09:50:53 compute-1 ceph-mon[80126]: fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:53 compute-1 ceph-mon[80126]: last_changed 2026-01-23T09:50:47.540109+0000
Jan 23 09:50:53 compute-1 ceph-mon[80126]: created 2026-01-23T09:47:35.499222+0000
Jan 23 09:50:53 compute-1 ceph-mon[80126]: min_mon_release 19 (squid)
Jan 23 09:50:53 compute-1 ceph-mon[80126]: election_strategy: 1
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Jan 23 09:50:53 compute-1 ceph-mon[80126]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Jan 23 09:50:53 compute-1 ceph-mon[80126]: fsmap 
Jan 23 09:50:53 compute-1 ceph-mon[80126]: osdmap e31: 2 total, 2 up, 2 in
Jan 23 09:50:53 compute-1 ceph-mon[80126]: mgrmap e9: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:53 compute-1 ceph-mon[80126]: overall HEALTH_OK
Jan 23 09:50:53 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:53 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Jan 23 09:50:53 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-2.uczrot", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:54 compute-1 ceph-mon[80126]: Deploying daemon mgr.compute-2.uczrot on compute-2
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4282911488' entity='client.admin' 
Jan 23 09:50:54 compute-1 ceph-mon[80126]: 5.18 scrub starts
Jan 23 09:50:54 compute-1 ceph-mon[80126]: 5.18 scrub ok
Jan 23 09:50:54 compute-1 ceph-mon[80126]: pgmap v106: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:54 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:50:54 compute-1 ceph-mon[80126]: 5.3 scrub starts
Jan 23 09:50:54 compute-1 ceph-mon[80126]: 5.3 scrub ok
Jan 23 09:50:54 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.18 deep-scrub starts
Jan 23 09:50:54 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.18 deep-scrub ok
Jan 23 09:50:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 23 09:50:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_auth_request failed to assign global_id
Jan 23 09:50:55 compute-1 sudo[80165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:50:55 compute-1 sudo[80165]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:55 compute-1 sudo[80165]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:55 compute-1 sudo[80190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:50:55 compute-1 sudo[80190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:50:55 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Jan 23 09:50:55 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Jan 23 09:50:55 compute-1 podman[80254]: 2026-01-23 09:50:55.717584531 +0000 UTC m=+0.049702039 container create 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 23 09:50:55 compute-1 systemd[1]: Started libpod-conmon-94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a.scope.
Jan 23 09:50:55 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:50:55 compute-1 podman[80254]: 2026-01-23 09:50:55.696244045 +0000 UTC m=+0.028361603 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:55 compute-1 podman[80254]: 2026-01-23 09:50:55.794692789 +0000 UTC m=+0.126810307 container init 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 09:50:55 compute-1 podman[80254]: 2026-01-23 09:50:55.803043327 +0000 UTC m=+0.135160835 container start 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS)
Jan 23 09:50:55 compute-1 ecstatic_pascal[80270]: 167 167
Jan 23 09:50:55 compute-1 systemd[1]: libpod-94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a.scope: Deactivated successfully.
Jan 23 09:50:55 compute-1 podman[80254]: 2026-01-23 09:50:55.811859181 +0000 UTC m=+0.143976699 container attach 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:50:55 compute-1 podman[80254]: 2026-01-23 09:50:55.81277315 +0000 UTC m=+0.144890658 container died 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 09:50:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-b42a0ebdf80131d105ff0e320c043f62051da928516dd336d1618a6dabec38a4-merged.mount: Deactivated successfully.
Jan 23 09:50:55 compute-1 podman[80254]: 2026-01-23 09:50:55.854371217 +0000 UTC m=+0.186488715 container remove 94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ecstatic_pascal, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True)
Jan 23 09:50:55 compute-1 systemd[1]: libpod-conmon-94ea1d234a18d008eb19ec4362503b8298c9e9afe0cd9e5f6a8fb022fecb156a.scope: Deactivated successfully.
Jan 23 09:50:55 compute-1 systemd[1]: Reloading.
Jan 23 09:50:55 compute-1 systemd-sysv-generator[80317]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:55 compute-1 systemd-rc-local-generator[80314]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:56 compute-1 ceph-mon[80126]: 4.18 deep-scrub starts
Jan 23 09:50:56 compute-1 ceph-mon[80126]: 4.18 deep-scrub ok
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 09:50:56 compute-1 ceph-mon[80126]: 3.6 scrub starts
Jan 23 09:50:56 compute-1 ceph-mon[80126]: 3.6 scrub ok
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.compute-1.jmakme", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 09:50:56 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:56 compute-1 ceph-mon[80126]: Deploying daemon mgr.compute-1.jmakme on compute-1
Jan 23 09:50:56 compute-1 ceph-mon[80126]: pgmap v107: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:56 compute-1 systemd[1]: Reloading.
Jan 23 09:50:56 compute-1 systemd-rc-local-generator[80351]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:50:56 compute-1 systemd-sysv-generator[80355]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:50:56 compute-1 systemd[1]: Starting Ceph mgr.compute-1.jmakme for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:50:56 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Jan 23 09:50:56 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Jan 23 09:50:56 compute-1 podman[80413]: 2026-01-23 09:50:56.681372266 +0000 UTC m=+0.056997643 container create c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True)
Jan 23 09:50:56 compute-1 podman[80413]: 2026-01-23 09:50:56.648147108 +0000 UTC m=+0.023772555 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:50:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7277cbb2c7757e589b80f719d5ef942d0603f5cfc3d63ff0dec5284e7172e2e5/merged/var/lib/ceph/mgr/ceph-compute-1.jmakme supports timestamps until 2038 (0x7fffffff)
Jan 23 09:50:56 compute-1 podman[80413]: 2026-01-23 09:50:56.766970587 +0000 UTC m=+0.142596034 container init c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 09:50:56 compute-1 podman[80413]: 2026-01-23 09:50:56.784001615 +0000 UTC m=+0.159627032 container start c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:50:56 compute-1 bash[80413]: c38fbb9e0518ef0565602b46527dbad670c096b87a85b029bc0b62fffaa07da4
Jan 23 09:50:56 compute-1 systemd[1]: Started Ceph mgr.compute-1.jmakme for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:50:56 compute-1 ceph-mgr[80432]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:50:56 compute-1 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:50:56 compute-1 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 09:50:56 compute-1 sudo[80190]: pam_unix(sudo:session): session closed for user root
Jan 23 09:50:56 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 09:50:56 compute-1 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:50:56 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 09:50:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:56.988+0000 7fcb174f0140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:50:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:57.081+0000 7fcb174f0140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:50:57 compute-1 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:50:57 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 6.19 scrub starts
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 6.19 scrub ok
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 5.0 scrub starts
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 5.0 scrub ok
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3189222711' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-1 ceph-mon[80126]: mgrmap e10: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 3.1c scrub starts
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 3.1c scrub ok
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 3.7 deep-scrub starts
Jan 23 09:50:57 compute-1 ceph-mon[80126]: 3.7 deep-scrub ok
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' 
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='mgr.14122 192.168.122.100:0/615021264' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:50:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 09:50:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1019916908 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:50:57 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Jan 23 09:50:57 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Jan 23 09:50:57 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 09:50:57 compute-1 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:50:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:57.942+0000 7fcb174f0140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:50:57 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 09:50:58 compute-1 ceph-mon[80126]: Deploying daemon crash.compute-2 on compute-2
Jan 23 09:50:58 compute-1 ceph-mon[80126]: pgmap v108: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:50:58 compute-1 ceph-mon[80126]: 6.1a scrub starts
Jan 23 09:50:58 compute-1 ceph-mon[80126]: 6.1a scrub ok
Jan 23 09:50:58 compute-1 sshd-session[72887]: Connection closed by 192.168.122.100 port 59098
Jan 23 09:50:58 compute-1 sshd-session[72831]: Connection closed by 192.168.122.100 port 59080
Jan 23 09:50:58 compute-1 sshd-session[72802]: Connection closed by 192.168.122.100 port 59064
Jan 23 09:50:58 compute-1 sshd-session[72858]: Connection closed by 192.168.122.100 port 59090
Jan 23 09:50:58 compute-1 sshd-session[72599]: Connection closed by 192.168.122.100 port 59006
Jan 23 09:50:58 compute-1 sshd-session[72773]: Connection closed by 192.168.122.100 port 59054
Jan 23 09:50:58 compute-1 sshd-session[72744]: Connection closed by 192.168.122.100 port 59042
Jan 23 09:50:58 compute-1 sshd-session[72715]: Connection closed by 192.168.122.100 port 59038
Jan 23 09:50:58 compute-1 sshd-session[72686]: Connection closed by 192.168.122.100 port 59028
Jan 23 09:50:58 compute-1 sshd-session[72657]: Connection closed by 192.168.122.100 port 59012
Jan 23 09:50:58 compute-1 sshd-session[72628]: Connection closed by 192.168.122.100 port 59010
Jan 23 09:50:58 compute-1 sshd-session[72597]: Connection closed by 192.168.122.100 port 58996
Jan 23 09:50:58 compute-1 sshd-session[72828]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 sshd-session[72575]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 sshd-session[72799]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 sshd-session[72625]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 sshd-session[72884]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 systemd[1]: session-30.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 sshd-session[72770]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 systemd[1]: session-28.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 sshd-session[72654]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 sshd-session[72855]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 sshd-session[72712]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 systemd[1]: session-31.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd[1]: session-32.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd[1]: session-32.scope: Consumed 1min 14.985s CPU time.
Jan 23 09:50:58 compute-1 systemd[1]: session-23.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd[1]: session-24.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 sshd-session[72741]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 systemd[1]: session-20.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd[1]: session-26.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 30 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 23 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd[1]: session-29.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd[1]: session-27.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 28 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 32 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 31 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 26 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 24 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 20 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 27 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 29 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 sshd-session[72592]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 systemd[1]: session-22.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 sshd-session[72683]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:50:58 compute-1 systemd[1]: session-25.scope: Deactivated successfully.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 22 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Session 25 logged out. Waiting for processes to exit.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 30.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 28.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 31.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 32.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 23.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 24.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 20.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 26.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 29.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 27.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 22.
Jan 23 09:50:58 compute-1 systemd-logind[807]: Removed session 25.
Jan 23 09:50:58 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1a deep-scrub starts
Jan 23 09:50:58 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1a deep-scrub ok
Jan 23 09:50:58 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:50:58 compute-1 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:50:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:58.689+0000 7fcb174f0140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:50:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:50:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:   from numpy import show_config as show_numpy_config
Jan 23 09:50:58 compute-1 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 09:50:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:58.897+0000 7fcb174f0140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-1 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:50:58 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 09:50:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:58.990+0000 7fcb174f0140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:50:59 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 09:50:59 compute-1 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:50:59 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:50:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:50:59.166+0000 7fcb174f0140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:50:59 compute-1 ceph-mon[80126]: 4.0 scrub starts
Jan 23 09:50:59 compute-1 ceph-mon[80126]: 4.0 scrub ok
Jan 23 09:50:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1618362368' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 09:50:59 compute-1 ceph-mon[80126]: mgrmap e11: compute-0.nbdygh(active, since 2m)
Jan 23 09:50:59 compute-1 ceph-mon[80126]: 5.1a deep-scrub starts
Jan 23 09:50:59 compute-1 ceph-mon[80126]: 5.1a deep-scrub ok
Jan 23 09:50:59 compute-1 ceph-mon[80126]: 4.7 deep-scrub starts
Jan 23 09:50:59 compute-1 ceph-mon[80126]: 4.7 deep-scrub ok
Jan 23 09:50:59 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Jan 23 09:50:59 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Jan 23 09:50:59 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 09:50:59 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:50:59 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:51:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.296+0000 7fcb174f0140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:51:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.512+0000 7fcb174f0140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 09:51:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.606+0000 7fcb174f0140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:51:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.682+0000 7fcb174f0140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 09:51:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.767+0000 7fcb174f0140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:00 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 09:51:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:00.842+0000 7fcb174f0140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-1 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:51:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:01.210+0000 7fcb174f0140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-1 ceph-mon[80126]: 4.1b scrub starts
Jan 23 09:51:01 compute-1 ceph-mon[80126]: 4.1b scrub ok
Jan 23 09:51:01 compute-1 ceph-mon[80126]: 3.0 deep-scrub starts
Jan 23 09:51:01 compute-1 ceph-mon[80126]: 3.0 deep-scrub ok
Jan 23 09:51:01 compute-1 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 09:51:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:01.307+0000 7fcb174f0140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1d scrub starts
Jan 23 09:51:01 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 09:51:01 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1d scrub ok
Jan 23 09:51:01 compute-1 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:01 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 09:51:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:01.747+0000 7fcb174f0140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.299+0000 7fcb174f0140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.374+0000 7fcb174f0140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e31 _set_new_cache_sizes cache_size:1020052916 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:02 compute-1 ceph-mon[80126]: 5.1b scrub starts
Jan 23 09:51:02 compute-1 ceph-mon[80126]: 5.1b scrub ok
Jan 23 09:51:02 compute-1 ceph-mon[80126]: 5.6 scrub starts
Jan 23 09:51:02 compute-1 ceph-mon[80126]: 5.6 scrub ok
Jan 23 09:51:02 compute-1 ceph-mon[80126]: 3.1d scrub starts
Jan 23 09:51:02 compute-1 ceph-mon[80126]: 3.1d scrub ok
Jan 23 09:51:02 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 09:51:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.455+0000 7fcb174f0140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Jan 23 09:51:02 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.626+0000 7fcb174f0140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.698+0000 7fcb174f0140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:02 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:02.848+0000 7fcb174f0140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.054+0000 7fcb174f0140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.319+0000 7fcb174f0140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.387+0000 7fcb174f0140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x560ee3ff8d00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  4: '--setuser'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  5: 'ceph'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  6: '--setgroup'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  7: 'ceph'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr respawn  exe_path /proc/self/exe
Jan 23 09:51:03 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.d scrub starts
Jan 23 09:51:03 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.d scrub ok
Jan 23 09:51:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 09:51:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 09:51:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.646+0000 7f5dcc91b140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 09:51:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:03.738+0000 7f5dcc91b140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:03 compute-1 ceph-mon[80126]: 5.c scrub starts
Jan 23 09:51:03 compute-1 ceph-mon[80126]: 5.c scrub ok
Jan 23 09:51:03 compute-1 ceph-mon[80126]: 4.1a scrub starts
Jan 23 09:51:03 compute-1 ceph-mon[80126]: 4.1a scrub ok
Jan 23 09:51:04 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 09:51:04 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1c deep-scrub starts
Jan 23 09:51:04 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1c deep-scrub ok
Jan 23 09:51:04 compute-1 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:04.525+0000 7f5dcc91b140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:04 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 09:51:04 compute-1 ceph-mon[80126]: 6.f scrub starts
Jan 23 09:51:04 compute-1 ceph-mon[80126]: 6.f scrub ok
Jan 23 09:51:04 compute-1 ceph-mon[80126]: 6.d scrub starts
Jan 23 09:51:04 compute-1 ceph-mon[80126]: 6.d scrub ok
Jan 23 09:51:04 compute-1 ceph-mon[80126]: mgrmap e12: compute-0.nbdygh(active, since 2m), standbys: compute-2.uczrot
Jan 23 09:51:04 compute-1 ceph-mon[80126]: 5.1c deep-scrub starts
Jan 23 09:51:04 compute-1 ceph-mon[80126]: 5.1c deep-scrub ok
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.161+0000 7f5dcc91b140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:51:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:51:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:51:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:   from numpy import show_config as show_numpy_config
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.324+0000 7f5dcc91b140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.395+0000 7f5dcc91b140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 09:51:05 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.e scrub starts
Jan 23 09:51:05 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.e scrub ok
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:05.541+0000 7f5dcc91b140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 09:51:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 e32: 2 total, 2 up, 2 in
Jan 23 09:51:05 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 09:51:06 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.c deep-scrub starts
Jan 23 09:51:06 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.c deep-scrub ok
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:51:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.540+0000 7f5dcc91b140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 sshd-session[80496]: Accepted publickey for ceph-admin from 192.168.122.100 port 53730 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:51:06 compute-1 systemd-logind[807]: New session 33 of user ceph-admin.
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:51:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.767+0000 7f5dcc91b140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 systemd[1]: Started Session 33 of User ceph-admin.
Jan 23 09:51:06 compute-1 sshd-session[80496]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.851+0000 7f5dcc91b140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 09:51:06 compute-1 sudo[80500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:06 compute-1 sudo[80500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:06 compute-1 sudo[80500]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:51:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:06.919+0000 7f5dcc91b140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:06 compute-1 sudo[80525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:51:06 compute-1 sudo[80525]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 09:51:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.003+0000 7f5dcc91b140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 09:51:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.076+0000 7f5dcc91b140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mon[80126]: 3.b scrub starts
Jan 23 09:51:07 compute-1 ceph-mon[80126]: 3.b scrub ok
Jan 23 09:51:07 compute-1 ceph-mon[80126]: 5.d scrub starts
Jan 23 09:51:07 compute-1 ceph-mon[80126]: 5.d scrub ok
Jan 23 09:51:07 compute-1 ceph-mon[80126]: 6.e scrub starts
Jan 23 09:51:07 compute-1 ceph-mon[80126]: 6.e scrub ok
Jan 23 09:51:07 compute-1 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:51:07 compute-1 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 09:51:07 compute-1 ceph-mon[80126]: osdmap e32: 2 total, 2 up, 2 in
Jan 23 09:51:07 compute-1 ceph-mon[80126]: mgrmap e13: compute-0.nbdygh(active, starting, since 0.407511s), standbys: compute-2.uczrot
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-0.nbdygh", "id": "compute-0.nbdygh"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-2.uczrot", "id": "compute-2.uczrot"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:51:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.428+0000 7f5dcc91b140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054705 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:07 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.e scrub starts
Jan 23 09:51:07 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.e scrub ok
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.526+0000 7f5dcc91b140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 09:51:07 compute-1 podman[80621]: 2026-01-23 09:51:07.607599594 +0000 UTC m=+0.070749796 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:51:07 compute-1 podman[80621]: 2026-01-23 09:51:07.704037882 +0000 UTC m=+0.167188114 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:07 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 09:51:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:07.940+0000 7f5dcc91b140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 sudo[80525]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:08 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.f scrub starts
Jan 23 09:51:08 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.f scrub ok
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.483+0000 7f5dcc91b140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.563+0000 7f5dcc91b140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 5.a scrub starts
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 5.a scrub ok
Jan 23 09:51:08 compute-1 ceph-mon[80126]: Manager daemon compute-0.nbdygh is now available
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 4.c deep-scrub starts
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 4.c deep-scrub ok
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 6.9 deep-scrub starts
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 6.9 deep-scrub ok
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 4.e scrub starts
Jan 23 09:51:08 compute-1 ceph-mon[80126]: 4.e scrub ok
Jan 23 09:51:08 compute-1 ceph-mon[80126]: mgrmap e14: compute-0.nbdygh(active, since 2s), standbys: compute-2.uczrot
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 09:51:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.646+0000 7f5dcc91b140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.798+0000 7f5dcc91b140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:08 compute-1 sudo[80707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:08 compute-1 sudo[80707]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:08 compute-1 sudo[80707]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:08.875+0000 7f5dcc91b140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:08 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:08 compute-1 sudo[80732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:51:08 compute-1 sudo[80732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.046+0000 7f5dcc91b140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.275+0000 7f5dcc91b140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.e scrub starts
Jan 23 09:51:09 compute-1 sudo[80732]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:09 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.e scrub ok
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.535+0000 7f5dcc91b140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 sudo[80787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:09 compute-1 sudo[80787]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:09 compute-1 sudo[80787]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:09.613+0000 7f5dcc91b140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: mgr load Constructed class from module: dashboard
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x564cc1832d00 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: [dashboard INFO root] Starting engine...
Jan 23 09:51:09 compute-1 sudo[80812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:51:09 compute-1 sudo[80812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:09 compute-1 ceph-mgr[80432]: [dashboard INFO root] Engine started...
Jan 23 09:51:09 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Bus STARTING
Jan 23 09:51:09 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Serving on http://192.168.122.100:8765
Jan 23 09:51:09 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Serving on https://192.168.122.100:7150
Jan 23 09:51:09 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Bus STARTED
Jan 23 09:51:09 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:07] ENGINE Client ('192.168.122.100', 55612) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 09:51:09 compute-1 ceph-mon[80126]: 4.b deep-scrub starts
Jan 23 09:51:09 compute-1 ceph-mon[80126]: 4.b deep-scrub ok
Jan 23 09:51:09 compute-1 ceph-mon[80126]: pgmap v4: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:09 compute-1 ceph-mon[80126]: 5.f scrub starts
Jan 23 09:51:09 compute-1 ceph-mon[80126]: 5.f scrub ok
Jan 23 09:51:09 compute-1 ceph-mon[80126]: from='client.14304 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-password", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:09 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:09 compute-1 sudo[80812]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[80867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:10 compute-1 sudo[80867]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[80867]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[80892]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:10 compute-1 sudo[80892]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[80892]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[80917]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-1 sudo[80917]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[80917]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[80942]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:10 compute-1 sudo[80942]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[80942]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[80967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-1 sudo[80967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[80967]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[81015]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-1 sudo[81015]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81015]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[81040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:10 compute-1 sudo[81040]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81040]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1a scrub starts
Jan 23 09:51:10 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.1a scrub ok
Jan 23 09:51:10 compute-1 sudo[81065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:51:10 compute-1 sudo[81065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81065]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[81090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:10 compute-1 sudo[81090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81090]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[81115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:10 compute-1 sudo[81115]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81115]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[81140]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:10 compute-1 sudo[81140]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81140]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[81165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:10 compute-1 sudo[81165]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81165]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:10 compute-1 sudo[81190]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:10 compute-1 sudo[81190]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:10 compute-1 sudo[81190]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:11 compute-1 sudo[81238]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81238]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81263]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:11 compute-1 sudo[81263]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81263]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:11 compute-1 sudo[81288]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81288]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:11 compute-1 sudo[81313]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81313]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1 scrub starts
Jan 23 09:51:11 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.1 scrub ok
Jan 23 09:51:11 compute-1 sudo[81338]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:11 compute-1 sudo[81338]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81338]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-1 sudo[81363]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81363]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:11 compute-1 sudo[81388]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81388]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-1 sudo[81413]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81413]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-1 sudo[81461]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81461]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:11 compute-1 sudo[81486]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:11 compute-1 sudo[81486]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:11 compute-1 sudo[81486]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:12 compute-1 sudo[81511]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81511]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81536]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:12 compute-1 sudo[81536]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81536]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:12 compute-1 sudo[81561]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81561]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:12 compute-1 sudo[81586]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81586]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:12 compute-1 sudo[81611]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81611]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:12 compute-1 sudo[81636]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:12 compute-1 sudo[81636]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Jan 23 09:51:12 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Jan 23 09:51:12 compute-1 sudo[81684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:12 compute-1 sudo[81684]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81684]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:12 compute-1 sudo[81709]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81709]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:12 compute-1 sudo[81734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:12 compute-1 sudo[81734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:12 compute-1 sudo[81734]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:13 compute-1 ceph-mon[80126]: 5.b scrub starts
Jan 23 09:51:13 compute-1 ceph-mon[80126]: 5.b scrub ok
Jan 23 09:51:13 compute-1 ceph-mon[80126]: 5.e scrub starts
Jan 23 09:51:13 compute-1 ceph-mon[80126]: 5.e scrub ok
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-1 ceph-mon[80126]: mgrmap e15: compute-0.nbdygh(active, since 4s), standbys: compute-2.uczrot
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Standby manager daemon compute-1.jmakme started
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Adjusting osd_memory_target on compute-0 to 127.9M
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Unable to set osd_memory_target on compute-0 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Adjusting osd_memory_target on compute-1 to 127.9M
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Unable to set osd_memory_target on compute-1 to 134214860: error parsing value: Value '134214860' is below minimum 939524096
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:13 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:51:13 compute-1 ceph-mon[80126]: pgmap v5: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:13 compute-1 ceph-mon[80126]: 3.1a scrub starts
Jan 23 09:51:13 compute-1 ceph-mon[80126]: 3.1a scrub ok
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:51:13 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:13 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Jan 23 09:51:13 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 5.8 deep-scrub starts
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 5.8 deep-scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='client.14310 -' entity='client.admin' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://192.168.122.100:9093", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 6.b scrub starts
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 6.b scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 4.1 scrub starts
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 4.1 scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:14 compute-1 ceph-mon[80126]: pgmap v6: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 4.17 scrub starts
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 4.17 scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 6.3 scrub starts
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 6.3 scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 4.16 scrub starts
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 4.16 scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: mgrmap e16: compute-0.nbdygh(active, since 7s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-1.jmakme", "id": "compute-1.jmakme"}]: dispatch
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 6.2 scrub starts
Jan 23 09:51:14 compute-1 ceph-mon[80126]: 6.2 scrub ok
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:14 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.5 deep-scrub starts
Jan 23 09:51:14 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.5 deep-scrub ok
Jan 23 09:51:15 compute-1 ceph-mon[80126]: Deploying daemon node-exporter.compute-0 on compute-0
Jan 23 09:51:15 compute-1 ceph-mon[80126]: 5.17 scrub starts
Jan 23 09:51:15 compute-1 ceph-mon[80126]: 5.17 scrub ok
Jan 23 09:51:15 compute-1 ceph-mon[80126]: pgmap v7: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail; 28 KiB/s rd, 0 B/s wr, 11 op/s
Jan 23 09:51:15 compute-1 ceph-mon[80126]: from='client.14316 -' entity='client.admin' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://192.168.122.100:9092", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:15 compute-1 ceph-mon[80126]: 6.5 deep-scrub starts
Jan 23 09:51:15 compute-1 ceph-mon[80126]: 6.5 deep-scrub ok
Jan 23 09:51:15 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Jan 23 09:51:15 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Jan 23 09:51:16 compute-1 ceph-mon[80126]: 6.14 scrub starts
Jan 23 09:51:16 compute-1 ceph-mon[80126]: 6.14 scrub ok
Jan 23 09:51:16 compute-1 ceph-mon[80126]: from='client.14322 -' entity='client.admin' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "http://192.168.122.100:3100", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:16 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:16 compute-1 ceph-mon[80126]: 5.1 scrub starts
Jan 23 09:51:16 compute-1 ceph-mon[80126]: 5.1 scrub ok
Jan 23 09:51:16 compute-1 ceph-mon[80126]: pgmap v8: 193 pgs: 193 active+clean; 449 KiB data, 53 MiB used, 40 GiB / 40 GiB avail; 21 KiB/s rd, 0 B/s wr, 8 op/s
Jan 23 09:51:16 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Jan 23 09:51:16 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Jan 23 09:51:16 compute-1 sudo[81759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:16 compute-1 sudo[81759]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:16 compute-1 sudo[81759]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:16 compute-1 sudo[81784]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:16 compute-1 sudo[81784]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:17 compute-1 systemd[1]: Reloading.
Jan 23 09:51:17 compute-1 systemd-rc-local-generator[81878]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:51:17 compute-1 systemd-sysv-generator[81882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:51:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:17 compute-1 systemd[1]: Reloading.
Jan 23 09:51:17 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Jan 23 09:51:17 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Jan 23 09:51:17 compute-1 systemd-rc-local-generator[81916]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:51:17 compute-1 systemd-sysv-generator[81922]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:51:17 compute-1 ceph-mon[80126]: 5.14 scrub starts
Jan 23 09:51:17 compute-1 ceph-mon[80126]: 5.14 scrub ok
Jan 23 09:51:17 compute-1 ceph-mon[80126]: 3.5 scrub starts
Jan 23 09:51:17 compute-1 ceph-mon[80126]: 3.5 scrub ok
Jan 23 09:51:17 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:17 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:17 compute-1 ceph-mon[80126]: from='mgr.14268 192.168.122.100:0/3010064577' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:17 compute-1 ceph-mon[80126]: Deploying daemon node-exporter.compute-1 on compute-1
Jan 23 09:51:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd=[{"prefix": "mgr module disable", "module": "dashboard"}]: dispatch
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 09:51:17 compute-1 sshd-session[80499]: Connection closed by 192.168.122.100 port 53730
Jan 23 09:51:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 09:51:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 09:51:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 09:51:17 compute-1 sshd-session[80496]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:51:17 compute-1 systemd[1]: Starting Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:51:17 compute-1 systemd-logind[807]: Session 33 logged out. Waiting for processes to exit.
Jan 23 09:51:18 compute-1 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 09:51:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:18.035+0000 7f013d384140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-1 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:18.117+0000 7f013d384140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:18 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 09:51:18 compute-1 bash[81996]: Trying to pull quay.io/prometheus/node-exporter:v1.7.0...
Jan 23 09:51:18 compute-1 bash[81996]: Getting image source signatures
Jan 23 09:51:18 compute-1 bash[81996]: Copying blob sha256:324153f2810a9927fcce320af9e4e291e0b6e805cbdd1f338386c756b9defa24
Jan 23 09:51:18 compute-1 bash[81996]: Copying blob sha256:2abcce694348cd2c949c0e98a7400ebdfd8341021bcf6b541bc72033ce982510
Jan 23 09:51:18 compute-1 bash[81996]: Copying blob sha256:455fd88e5221bc1e278ef2d059cd70e4df99a24e5af050ede621534276f6cf9a
Jan 23 09:51:18 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.3 scrub starts
Jan 23 09:51:18 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.3 scrub ok
Jan 23 09:51:18 compute-1 ceph-mon[80126]: 6.16 deep-scrub starts
Jan 23 09:51:18 compute-1 ceph-mon[80126]: 6.16 deep-scrub ok
Jan 23 09:51:18 compute-1 ceph-mon[80126]: 5.2 scrub starts
Jan 23 09:51:18 compute-1 ceph-mon[80126]: 5.2 scrub ok
Jan 23 09:51:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1110789864' entity='client.admin' cmd='[{"prefix": "mgr module disable", "module": "dashboard"}]': finished
Jan 23 09:51:18 compute-1 ceph-mon[80126]: mgrmap e17: compute-0.nbdygh(active, since 12s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch
Jan 23 09:51:18 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.032+0000 7f013d384140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-1 bash[81996]: Copying config sha256:72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e
Jan 23 09:51:19 compute-1 bash[81996]: Writing manifest to image destination
Jan 23 09:51:19 compute-1 podman[81996]: 2026-01-23 09:51:19.310044666 +0000 UTC m=+1.139420861 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 23 09:51:19 compute-1 podman[81996]: 2026-01-23 09:51:19.331257138 +0000 UTC m=+1.160633293 container create 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a9cf7871bcaed94d55bf97d20a317e95aa8ecd54623be987a412d3816ee0ab4/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 23 09:51:19 compute-1 podman[81996]: 2026-01-23 09:51:19.397726994 +0000 UTC m=+1.227103139 container init 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:19 compute-1 podman[81996]: 2026-01-23 09:51:19.407257301 +0000 UTC m=+1.236633446 container start 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:19 compute-1 bash[81996]: 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6
Jan 23 09:51:19 compute-1 systemd[1]: Started Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.420Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.420Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.422Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=arp
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=bcache
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=bonding
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=cpu
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=dmi
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=edac
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=entropy
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=filefd
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=netclass
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=netdev
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=netstat
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=nfs
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=nvme
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=os
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=pressure
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=rapl
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=selinux
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=softnet
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=stat
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=textfile
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=time
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=uname
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=xfs
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.425Z caller=node_exporter.go:117 level=info collector=zfs
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.427Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[82083]: ts=2026-01-23T09:51:19.427Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 23 09:51:19 compute-1 sudo[81784]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:19 compute-1 systemd[1]: session-33.scope: Deactivated successfully.
Jan 23 09:51:19 compute-1 systemd[1]: session-33.scope: Consumed 6.172s CPU time.
Jan 23 09:51:19 compute-1 systemd-logind[807]: Removed session 33.
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:51:19 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Jan 23 09:51:19 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.711+0000 7f013d384140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:   from numpy import show_config as show_numpy_config
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.882+0000 7f013d384140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 09:51:19 compute-1 ceph-mon[80126]: 5.12 scrub starts
Jan 23 09:51:19 compute-1 ceph-mon[80126]: 5.12 scrub ok
Jan 23 09:51:19 compute-1 ceph-mon[80126]: 3.3 scrub starts
Jan 23 09:51:19 compute-1 ceph-mon[80126]: 3.3 scrub ok
Jan 23 09:51:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/435334493' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished
Jan 23 09:51:19 compute-1 ceph-mon[80126]: mgrmap e18: compute-0.nbdygh(active, since 13s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:19 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 09:51:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:19.954+0000 7f013d384140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:20 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 09:51:20 compute-1 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:20.098+0000 7f013d384140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:20 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:51:20 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 09:51:20 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:51:20 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.4 deep-scrub starts
Jan 23 09:51:20 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.4 deep-scrub ok
Jan 23 09:51:20 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 09:51:20 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 09:51:21 compute-1 ceph-mon[80126]: 6.11 deep-scrub starts
Jan 23 09:51:21 compute-1 ceph-mon[80126]: 6.11 deep-scrub ok
Jan 23 09:51:21 compute-1 ceph-mon[80126]: 4.5 scrub starts
Jan 23 09:51:21 compute-1 ceph-mon[80126]: 4.5 scrub ok
Jan 23 09:51:21 compute-1 ceph-mon[80126]: 5.4 deep-scrub starts
Jan 23 09:51:21 compute-1 ceph-mon[80126]: 5.4 deep-scrub ok
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:51:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.149+0000 7f013d384140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.380+0000 7f013d384140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 09:51:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.497+0000 7f013d384140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:51:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.577+0000 7f013d384140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.9 scrub starts
Jan 23 09:51:21 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.9 scrub ok
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 09:51:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.659+0000 7f013d384140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:21 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 09:51:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:21.736+0000 7f013d384140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-1 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:51:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:22.083+0000 7f013d384140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-1 ceph-mon[80126]: 5.13 scrub starts
Jan 23 09:51:22 compute-1 ceph-mon[80126]: 5.13 scrub ok
Jan 23 09:51:22 compute-1 ceph-mon[80126]: 6.10 scrub starts
Jan 23 09:51:22 compute-1 ceph-mon[80126]: 6.10 scrub ok
Jan 23 09:51:22 compute-1 ceph-mon[80126]: 3.9 scrub starts
Jan 23 09:51:22 compute-1 ceph-mon[80126]: 3.9 scrub ok
Jan 23 09:51:22 compute-1 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 09:51:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:22.181+0000 7f013d384140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 09:51:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e32 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:22 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Jan 23 09:51:22 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Jan 23 09:51:22 compute-1 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:22 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 09:51:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:22.636+0000 7f013d384140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mon[80126]: 6.13 scrub starts
Jan 23 09:51:23 compute-1 ceph-mon[80126]: 6.13 scrub ok
Jan 23 09:51:23 compute-1 ceph-mon[80126]: 5.7 scrub starts
Jan 23 09:51:23 compute-1 ceph-mon[80126]: 5.7 scrub ok
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.198+0000 7f013d384140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.267+0000 7f013d384140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 09:51:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.342+0000 7f013d384140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.488+0000 7f013d384140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.572+0000 7f013d384140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Jan 23 09:51:23 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.735+0000 7f013d384140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:23.951+0000 7f013d384140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:23 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.218+0000 7f013d384140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.283+0000 7f013d384140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-mon[80126]: 5.1e scrub starts
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x559728edd860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 09:51:24 compute-1 ceph-mon[80126]: 5.1e scrub ok
Jan 23 09:51:24 compute-1 ceph-mon[80126]: 6.8 scrub starts
Jan 23 09:51:24 compute-1 ceph-mon[80126]: 6.8 scrub ok
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  4: '--setuser'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  5: 'ceph'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  6: '--setgroup'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  7: 'ceph'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 09:51:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 09:51:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 09:51:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.525+0000 7fe273115140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 09:51:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:24.618+0000 7fe273115140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:51:24 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 09:51:24 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.a scrub starts
Jan 23 09:51:24 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.a scrub ok
Jan 23 09:51:25 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e33 e33: 2 total, 2 up, 2 in
Jan 23 09:51:25 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 09:51:25 compute-1 ceph-mon[80126]: 6.1d scrub starts
Jan 23 09:51:25 compute-1 ceph-mon[80126]: 6.1d scrub ok
Jan 23 09:51:25 compute-1 ceph-mon[80126]: Standby manager daemon compute-1.jmakme restarted
Jan 23 09:51:25 compute-1 ceph-mon[80126]: Standby manager daemon compute-1.jmakme started
Jan 23 09:51:25 compute-1 ceph-mon[80126]: 4.a scrub starts
Jan 23 09:51:25 compute-1 ceph-mon[80126]: 4.a scrub ok
Jan 23 09:51:25 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:51:25 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:25 compute-1 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:51:25 compute-1 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 09:51:25 compute-1 ceph-mon[80126]: osdmap e33: 2 total, 2 up, 2 in
Jan 23 09:51:25 compute-1 ceph-mon[80126]: mgrmap e19: compute-0.nbdygh(active, starting, since 0.060897s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:25.410+0000 7fe273115140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:25 compute-1 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:51:25 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 09:51:25 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.7 scrub starts
Jan 23 09:51:25 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.7 scrub ok
Jan 23 09:51:25 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:51:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.012+0000 7fe273115140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:51:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:51:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:51:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:   from numpy import show_config as show_numpy_config
Jan 23 09:51:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.161+0000 7fe273115140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 09:51:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.231+0000 7fe273115140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 09:51:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:26.381+0000 7fe273115140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:51:26 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.d scrub starts
Jan 23 09:51:26 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.d scrub ok
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 09:51:26 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:51:26 compute-1 ceph-mon[80126]: 7.1b scrub starts
Jan 23 09:51:26 compute-1 ceph-mon[80126]: 7.1b scrub ok
Jan 23 09:51:26 compute-1 ceph-mon[80126]: 6.7 scrub starts
Jan 23 09:51:26 compute-1 ceph-mon[80126]: 6.7 scrub ok
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 09:51:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.335+0000 7fe273115140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:51:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.544+0000 7fe273115140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:51:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.613+0000 7fe273115140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 09:51:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.678+0000 7fe273115140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:51:27 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.c scrub starts
Jan 23 09:51:27 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.c scrub ok
Jan 23 09:51:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.754+0000 7fe273115140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 09:51:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:27.823+0000 7fe273115140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:51:27 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 7.18 scrub starts
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 7.18 scrub ok
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 4.d scrub starts
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 4.d scrub ok
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 2.1b scrub starts
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 2.1b scrub ok
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 3.c scrub starts
Jan 23 09:51:27 compute-1 ceph-mon[80126]: 3.c scrub ok
Jan 23 09:51:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:28.174+0000 7fe273115140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-1 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:51:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:28.279+0000 7fe273115140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-1 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 09:51:28 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 09:51:28 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.d deep-scrub starts
Jan 23 09:51:28 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.d deep-scrub ok
Jan 23 09:51:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:28.695+0000 7fe273115140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-1 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:51:28 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 09:51:29 compute-1 ceph-mon[80126]: 7.6 scrub starts
Jan 23 09:51:29 compute-1 ceph-mon[80126]: 7.6 scrub ok
Jan 23 09:51:29 compute-1 ceph-mon[80126]: 3.d deep-scrub starts
Jan 23 09:51:29 compute-1 ceph-mon[80126]: 3.d deep-scrub ok
Jan 23 09:51:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.247+0000 7fe273115140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 09:51:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.321+0000 7fe273115140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:51:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.411+0000 7fe273115140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 09:51:29 compute-1 systemd[1]: Stopping User Manager for UID 42477...
Jan 23 09:51:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.573+0000 7fe273115140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 systemd[72579]: Activating special unit Exit the Session...
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped target Main User Target.
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped target Basic System.
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped target Paths.
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped target Sockets.
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped target Timers.
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 09:51:29 compute-1 systemd[72579]: Closed D-Bus User Message Bus Socket.
Jan 23 09:51:29 compute-1 systemd[72579]: Stopped Create User's Volatile Files and Directories.
Jan 23 09:51:29 compute-1 systemd[72579]: Removed slice User Application Slice.
Jan 23 09:51:29 compute-1 systemd[72579]: Reached target Shutdown.
Jan 23 09:51:29 compute-1 systemd[72579]: Finished Exit the Session.
Jan 23 09:51:29 compute-1 systemd[72579]: Reached target Exit the Session.
Jan 23 09:51:29 compute-1 systemd[1]: user@42477.service: Deactivated successfully.
Jan 23 09:51:29 compute-1 systemd[1]: Stopped User Manager for UID 42477.
Jan 23 09:51:29 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/42477...
Jan 23 09:51:29 compute-1 systemd[1]: run-user-42477.mount: Deactivated successfully.
Jan 23 09:51:29 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Jan 23 09:51:29 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Jan 23 09:51:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.658+0000 7fe273115140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 09:51:29 compute-1 systemd[1]: user-runtime-dir@42477.service: Deactivated successfully.
Jan 23 09:51:29 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/42477.
Jan 23 09:51:29 compute-1 systemd[1]: Removed slice User Slice of UID 42477.
Jan 23 09:51:29 compute-1 systemd[1]: user-42477.slice: Consumed 1min 22.559s CPU time.
Jan 23 09:51:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:29.832+0000 7fe273115140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:51:29 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:51:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:30.064+0000 7fe273115140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 09:51:30 compute-1 ceph-mon[80126]: 7.1e deep-scrub starts
Jan 23 09:51:30 compute-1 ceph-mon[80126]: 7.1e deep-scrub ok
Jan 23 09:51:30 compute-1 ceph-mon[80126]: 5.9 scrub starts
Jan 23 09:51:30 compute-1 ceph-mon[80126]: 5.9 scrub ok
Jan 23 09:51:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:30.367+0000 7fe273115140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 09:51:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:51:30.439+0000 7fe273115140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: mgr load Constructed class from module: dashboard
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: [dashboard INFO root] Starting engine...
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x5616c891d860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 09:51:30 compute-1 ceph-mgr[80432]: [dashboard INFO root] Engine started...
Jan 23 09:51:30 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.8 scrub starts
Jan 23 09:51:30 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.8 scrub ok
Jan 23 09:51:31 compute-1 ceph-mon[80126]: 7.2 scrub starts
Jan 23 09:51:31 compute-1 ceph-mon[80126]: 7.2 scrub ok
Jan 23 09:51:31 compute-1 ceph-mon[80126]: Standby manager daemon compute-1.jmakme restarted
Jan 23 09:51:31 compute-1 ceph-mon[80126]: Standby manager daemon compute-1.jmakme started
Jan 23 09:51:31 compute-1 ceph-mon[80126]: 4.8 scrub starts
Jan 23 09:51:31 compute-1 ceph-mon[80126]: 4.8 scrub ok
Jan 23 09:51:31 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.a deep-scrub starts
Jan 23 09:51:31 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.a deep-scrub ok
Jan 23 09:51:32 compute-1 ceph-mon[80126]: 7.3 scrub starts
Jan 23 09:51:32 compute-1 ceph-mon[80126]: mgrmap e20: compute-0.nbdygh(active, starting, since 6s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:32 compute-1 ceph-mon[80126]: 7.3 scrub ok
Jan 23 09:51:32 compute-1 ceph-mon[80126]: 6.a deep-scrub starts
Jan 23 09:51:32 compute-1 ceph-mon[80126]: 6.a deep-scrub ok
Jan 23 09:51:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e33 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:32 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.a scrub starts
Jan 23 09:51:32 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.a scrub ok
Jan 23 09:51:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e34 e34: 2 total, 2 up, 2 in
Jan 23 09:51:33 compute-1 ceph-mon[80126]: 7.e scrub starts
Jan 23 09:51:33 compute-1 ceph-mon[80126]: 7.e scrub ok
Jan 23 09:51:33 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:51:33 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot started
Jan 23 09:51:33 compute-1 ceph-mon[80126]: 3.a scrub starts
Jan 23 09:51:33 compute-1 ceph-mon[80126]: 3.a scrub ok
Jan 23 09:51:33 compute-1 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:51:33 compute-1 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 09:51:33 compute-1 ceph-mon[80126]: osdmap e34: 2 total, 2 up, 2 in
Jan 23 09:51:33 compute-1 ceph-mon[80126]: mgrmap e21: compute-0.nbdygh(active, starting, since 0.0330109s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-0.nbdygh", "id": "compute-0.nbdygh"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-2.uczrot", "id": "compute-2.uczrot"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-1.jmakme", "id": "compute-1.jmakme"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: Manager daemon compute-0.nbdygh is now available
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 09:51:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 09:51:33 compute-1 sshd-session[82136]: Accepted publickey for ceph-admin from 192.168.122.100 port 38422 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:51:33 compute-1 systemd-logind[807]: New session 34 of user ceph-admin.
Jan 23 09:51:33 compute-1 systemd[1]: Created slice User Slice of UID 42477.
Jan 23 09:51:33 compute-1 systemd[1]: Starting User Runtime Directory /run/user/42477...
Jan 23 09:51:33 compute-1 systemd[1]: Finished User Runtime Directory /run/user/42477.
Jan 23 09:51:33 compute-1 systemd[1]: Starting User Manager for UID 42477...
Jan 23 09:51:33 compute-1 systemd[82140]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:51:33 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.e deep-scrub starts
Jan 23 09:51:33 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.e deep-scrub ok
Jan 23 09:51:33 compute-1 systemd[82140]: Queued start job for default target Main User Target.
Jan 23 09:51:33 compute-1 systemd[82140]: Created slice User Application Slice.
Jan 23 09:51:33 compute-1 systemd[82140]: Started Mark boot as successful after the user session has run 2 minutes.
Jan 23 09:51:33 compute-1 systemd[82140]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 09:51:33 compute-1 systemd[82140]: Reached target Paths.
Jan 23 09:51:33 compute-1 systemd[82140]: Reached target Timers.
Jan 23 09:51:33 compute-1 systemd[82140]: Starting D-Bus User Message Bus Socket...
Jan 23 09:51:33 compute-1 systemd[82140]: Starting Create User's Volatile Files and Directories...
Jan 23 09:51:33 compute-1 systemd[82140]: Listening on D-Bus User Message Bus Socket.
Jan 23 09:51:33 compute-1 systemd[82140]: Reached target Sockets.
Jan 23 09:51:33 compute-1 systemd[82140]: Finished Create User's Volatile Files and Directories.
Jan 23 09:51:33 compute-1 systemd[82140]: Reached target Basic System.
Jan 23 09:51:33 compute-1 systemd[82140]: Reached target Main User Target.
Jan 23 09:51:33 compute-1 systemd[82140]: Startup finished in 155ms.
Jan 23 09:51:33 compute-1 systemd[1]: Started User Manager for UID 42477.
Jan 23 09:51:33 compute-1 systemd[1]: Started Session 34 of User ceph-admin.
Jan 23 09:51:33 compute-1 sshd-session[82136]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:51:33 compute-1 sudo[82156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:33 compute-1 sudo[82156]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:33 compute-1 sudo[82156]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:33 compute-1 sudo[82181]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:51:33 compute-1 sudo[82181]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e2 new map
Jan 23 09:51:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e2 print_map
                                           e2
                                           btime 2026-01-23T09:51:34:000852+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:51:34.000760+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
Jan 23 09:51:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e35 e35: 2 total, 2 up, 2 in
Jan 23 09:51:34 compute-1 podman[82275]: 2026-01-23 09:51:34.55739482 +0000 UTC m=+0.101723451 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:51:34 compute-1 ceph-mon[80126]: 7.f deep-scrub starts
Jan 23 09:51:34 compute-1 ceph-mon[80126]: 7.f deep-scrub ok
Jan 23 09:51:34 compute-1 ceph-mon[80126]: 3.e deep-scrub starts
Jan 23 09:51:34 compute-1 ceph-mon[80126]: 3.e deep-scrub ok
Jan 23 09:51:34 compute-1 ceph-mon[80126]: mgrmap e22: compute-0.nbdygh(active, since 1.09481s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:34 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch
Jan 23 09:51:34 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch
Jan 23 09:51:34 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch
Jan 23 09:51:34 compute-1 ceph-mon[80126]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
Jan 23 09:51:34 compute-1 ceph-mon[80126]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX)
Jan 23 09:51:34 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished
Jan 23 09:51:34 compute-1 ceph-mon[80126]: osdmap e35: 2 total, 2 up, 2 in
Jan 23 09:51:34 compute-1 ceph-mon[80126]: fsmap cephfs:0
Jan 23 09:51:34 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:34 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.9 scrub starts
Jan 23 09:51:34 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.9 scrub ok
Jan 23 09:51:34 compute-1 podman[82275]: 2026-01-23 09:51:34.684437733 +0000 UTC m=+0.228766324 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:51:35 compute-1 podman[82396]: 2026-01-23 09:51:35.220751629 +0000 UTC m=+0.061746475 container exec 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:35 compute-1 podman[82396]: 2026-01-23 09:51:35.231876857 +0000 UTC m=+0.072871673 container exec_died 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:51:35 compute-1 sudo[82181]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:35 compute-1 sudo[82435]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:35 compute-1 sudo[82435]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:35 compute-1 sudo[82435]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:35 compute-1 sudo[82460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:51:35 compute-1 sudo[82460]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:35 compute-1 ceph-mon[80126]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:35 compute-1 ceph-mon[80126]: 7.9 scrub starts
Jan 23 09:51:35 compute-1 ceph-mon[80126]: 7.9 scrub ok
Jan 23 09:51:35 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Bus STARTING
Jan 23 09:51:35 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Serving on http://192.168.122.100:8765
Jan 23 09:51:35 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Serving on https://192.168.122.100:7150
Jan 23 09:51:35 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Bus STARTED
Jan 23 09:51:35 compute-1 ceph-mon[80126]: [23/Jan/2026:09:51:34] ENGINE Client ('192.168.122.100', 48072) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 09:51:35 compute-1 ceph-mon[80126]: 4.9 scrub starts
Jan 23 09:51:35 compute-1 ceph-mon[80126]: 4.9 scrub ok
Jan 23 09:51:35 compute-1 ceph-mon[80126]: pgmap v5: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='client.14376 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:35 compute-1 ceph-mon[80126]: Saving service mds.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:35 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Jan 23 09:51:35 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Jan 23 09:51:36 compute-1 sudo[82460]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:36 compute-1 sudo[82516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:51:36 compute-1 sudo[82516]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:36 compute-1 sudo[82516]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:36 compute-1 sudo[82541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:51:36 compute-1 sudo[82541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:36 compute-1 sudo[82541]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:36 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Jan 23 09:51:36 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Jan 23 09:51:36 compute-1 ceph-mon[80126]: 2.c scrub starts
Jan 23 09:51:36 compute-1 ceph-mon[80126]: 2.c scrub ok
Jan 23 09:51:36 compute-1 ceph-mon[80126]: 3.10 scrub starts
Jan 23 09:51:36 compute-1 ceph-mon[80126]: 3.10 scrub ok
Jan 23 09:51:36 compute-1 ceph-mon[80126]: mgrmap e23: compute-0.nbdygh(active, since 2s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]: dispatch
Jan 23 09:51:36 compute-1 sudo[82584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:36 compute-1 sudo[82584]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:36 compute-1 sudo[82584]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:37 compute-1 sudo[82609]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82609]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-1 sudo[82634]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82634]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:37 compute-1 sudo[82659]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82659]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-1 sudo[82684]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82684]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e36 e36: 2 total, 2 up, 2 in
Jan 23 09:51:37 compute-1 sudo[82732]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-1 sudo[82732]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82732]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:51:37 compute-1 sudo[82757]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e36 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:37 compute-1 sudo[82757]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82782]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:51:37 compute-1 sudo[82782]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82782]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:37 compute-1 sudo[82807]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82807]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:37 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.f scrub starts
Jan 23 09:51:37 compute-1 sudo[82832]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82832]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.f scrub ok
Jan 23 09:51:37 compute-1 sudo[82857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:37 compute-1 sudo[82857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82857]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 ceph-mon[80126]: 2.d scrub starts
Jan 23 09:51:37 compute-1 ceph-mon[80126]: 2.d scrub ok
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='client.14385 -' entity='client.admin' cmd=[{"prefix": "nfs cluster create", "cluster_id": "cephfs", "ingress": true, "virtual_ip": "192.168.122.2/24", "ingress_mode": "haproxy-protocol", "placement": "compute-0 compute-1 compute-2 ", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 09:51:37 compute-1 ceph-mon[80126]: 3.13 scrub starts
Jan 23 09:51:37 compute-1 ceph-mon[80126]: 3.13 scrub ok
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]': finished
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:51:37 compute-1 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 09:51:37 compute-1 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 09:51:37 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:51:37 compute-1 ceph-mon[80126]: pgmap v6: 193 pgs: 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool create", "pool": ".nfs", "yes_i_really_mean_it": true}]': finished
Jan 23 09:51:37 compute-1 ceph-mon[80126]: osdmap e36: 2 total, 2 up, 2 in
Jan 23 09:51:37 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]: dispatch
Jan 23 09:51:37 compute-1 sudo[82882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:37 compute-1 sudo[82882]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82882]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:37 compute-1 sudo[82907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:37 compute-1 sudo[82907]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:37 compute-1 sudo[82907]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[82955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:38 compute-1 sudo[82955]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[82955]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[82980]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:51:38 compute-1 sudo[82980]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[82980]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[83005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:38 compute-1 sudo[83005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83005]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e37 e37: 2 total, 2 up, 2 in
Jan 23 09:51:38 compute-1 sudo[83030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:51:38 compute-1 sudo[83030]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83030]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[83055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:51:38 compute-1 sudo[83055]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83055]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[83080]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-1 sudo[83080]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83080]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[83105]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:38 compute-1 sudo[83105]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83105]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[83130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-1 sudo[83130]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83130]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.15 scrub starts
Jan 23 09:51:38 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.15 scrub ok
Jan 23 09:51:38 compute-1 sudo[83178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-1 sudo[83178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83178]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[83203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:51:38 compute-1 sudo[83203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83203]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:38 compute-1 sudo[83228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:38 compute-1 sudo[83228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:38 compute-1 sudo[83228]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 ceph-mon[80126]: 2.e scrub starts
Jan 23 09:51:39 compute-1 ceph-mon[80126]: 2.e scrub ok
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:51:39 compute-1 ceph-mon[80126]: 3.f scrub starts
Jan 23 09:51:39 compute-1 ceph-mon[80126]: 3.f scrub ok
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:39 compute-1 ceph-mon[80126]: mgrmap e24: compute-0.nbdygh(active, since 5s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:51:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool application enable", "pool": ".nfs", "app": "nfs"}]': finished
Jan 23 09:51:39 compute-1 ceph-mon[80126]: osdmap e37: 2 total, 2 up, 2 in
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Saving service nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:39 compute-1 ceph-mon[80126]: Saving service ingress.nfs.cephfs spec with placement compute-0;compute-1;compute-2
Jan 23 09:51:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:39 compute-1 sudo[83253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:39 compute-1 sudo[83253]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83253]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 sudo[83278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:51:39 compute-1 sudo[83278]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83278]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 sudo[83303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:39 compute-1 sudo[83303]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83303]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e38 e38: 2 total, 2 up, 2 in
Jan 23 09:51:39 compute-1 sudo[83328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:51:39 compute-1 sudo[83328]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83328]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 sudo[83353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:39 compute-1 sudo[83353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83353]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 sudo[83401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:39 compute-1 sudo[83401]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83401]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 sudo[83426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:51:39 compute-1 sudo[83426]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83426]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:39 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Jan 23 09:51:39 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Jan 23 09:51:39 compute-1 sudo[83451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:39 compute-1 sudo[83451]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:51:39 compute-1 sudo[83451]: pam_unix(sudo:session): session closed for user root
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 2.10 scrub starts
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 2.10 scrub ok
Jan 23 09:51:40 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 4.15 scrub starts
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 4.15 scrub ok
Jan 23 09:51:40 compute-1 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:40 compute-1 ceph-mon[80126]: pgmap v9: 194 pgs: 1 unknown, 193 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:40 compute-1 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:51:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 2.13 scrub starts
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 2.13 scrub ok
Jan 23 09:51:40 compute-1 ceph-mon[80126]: osdmap e38: 2 total, 2 up, 2 in
Jan 23 09:51:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-1 ceph-mon[80126]: mgrmap e25: compute-0.nbdygh(active, since 6s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:51:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 5.15 scrub starts
Jan 23 09:51:40 compute-1 ceph-mon[80126]: 5.15 scrub ok
Jan 23 09:51:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:40 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.16 scrub starts
Jan 23 09:51:40 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.16 scrub ok
Jan 23 09:51:41 compute-1 ceph-mon[80126]: Deploying daemon node-exporter.compute-2 on compute-2
Jan 23 09:51:41 compute-1 ceph-mon[80126]: 2.15 scrub starts
Jan 23 09:51:41 compute-1 ceph-mon[80126]: 2.15 scrub ok
Jan 23 09:51:41 compute-1 ceph-mon[80126]: 5.16 scrub starts
Jan 23 09:51:41 compute-1 ceph-mon[80126]: 5.16 scrub ok
Jan 23 09:51:41 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Jan 23 09:51:41 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Jan 23 09:51:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:42 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Jan 23 09:51:42 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Jan 23 09:51:43 compute-1 ceph-mon[80126]: pgmap v11: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Jan 23 09:51:43 compute-1 ceph-mon[80126]: 2.19 scrub starts
Jan 23 09:51:43 compute-1 ceph-mon[80126]: 2.19 scrub ok
Jan 23 09:51:43 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd=[{"prefix": "auth import"}]: dispatch
Jan 23 09:51:43 compute-1 ceph-mon[80126]: 4.13 scrub starts
Jan 23 09:51:43 compute-1 ceph-mon[80126]: 4.13 scrub ok
Jan 23 09:51:43 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Jan 23 09:51:43 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Jan 23 09:51:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1f deep-scrub starts
Jan 23 09:51:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.1f deep-scrub ok
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 7.4 scrub starts
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 7.4 scrub ok
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 5.11 scrub starts
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 5.11 scrub ok
Jan 23 09:51:44 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/992291970' entity='client.admin' cmd='[{"prefix": "auth import"}]': finished
Jan 23 09:51:44 compute-1 ceph-mon[80126]: pgmap v12: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 29 KiB/s rd, 0 B/s wr, 12 op/s
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 7.8 scrub starts
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 7.8 scrub ok
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 3.14 scrub starts
Jan 23 09:51:44 compute-1 ceph-mon[80126]: 3.14 scrub ok
Jan 23 09:51:44 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:44 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1904452043' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Jan 23 09:51:45 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.10 scrub starts
Jan 23 09:51:45 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 5.10 scrub ok
Jan 23 09:51:46 compute-1 ceph-mon[80126]: 7.a scrub starts
Jan 23 09:51:46 compute-1 ceph-mon[80126]: 7.a scrub ok
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:46 compute-1 ceph-mon[80126]: 5.1f deep-scrub starts
Jan 23 09:51:46 compute-1 ceph-mon[80126]: 5.1f deep-scrub ok
Jan 23 09:51:46 compute-1 ceph-mon[80126]: pgmap v13: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 23 KiB/s rd, 0 B/s wr, 9 op/s
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:51:46 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:46 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1e deep-scrub starts
Jan 23 09:51:46 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1e deep-scrub ok
Jan 23 09:51:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e38 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:47 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1c scrub starts
Jan 23 09:51:47 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.1c scrub ok
Jan 23 09:51:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 e39: 3 total, 2 up, 3 in
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 7.14 deep-scrub starts
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 7.14 deep-scrub ok
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 5.10 scrub starts
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 5.10 scrub ok
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 7.b scrub starts
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 7.b scrub ok
Jan 23 09:51:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/985471869' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 6.1e deep-scrub starts
Jan 23 09:51:48 compute-1 ceph-mon[80126]: 6.1e deep-scrub ok
Jan 23 09:51:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1205331151' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 09:51:48 compute-1 ceph-mon[80126]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]: dispatch
Jan 23 09:51:48 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.12 scrub starts
Jan 23 09:51:48 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.12 scrub ok
Jan 23 09:51:49 compute-1 ceph-mon[80126]: pgmap v14: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 20 KiB/s rd, 0 B/s wr, 8 op/s
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 7.10 scrub starts
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 7.10 scrub ok
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 6.1c scrub starts
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 6.1c scrub ok
Jan 23 09:51:49 compute-1 ceph-mon[80126]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "2edb8fa1-89ea-44cd-9b6e-9f4d89095397"}]': finished
Jan 23 09:51:49 compute-1 ceph-mon[80126]: osdmap e39: 3 total, 2 up, 3 in
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 7.13 scrub starts
Jan 23 09:51:49 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 7.13 scrub ok
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 6.12 scrub starts
Jan 23 09:51:49 compute-1 ceph-mon[80126]: 6.12 scrub ok
Jan 23 09:51:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3212942412' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
Jan 23 09:51:49 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:49 compute-1 ceph-mon[80126]: pgmap v16: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3560526778' entity='client.admin' cmd=[{"prefix": "auth get", "entity": "client.openstack"}]: dispatch
Jan 23 09:51:49 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.17 scrub starts
Jan 23 09:51:49 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.17 scrub ok
Jan 23 09:51:50 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Jan 23 09:51:50 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Jan 23 09:51:51 compute-1 ceph-mon[80126]: 7.1d scrub starts
Jan 23 09:51:51 compute-1 ceph-mon[80126]: 7.1d scrub ok
Jan 23 09:51:51 compute-1 ceph-mon[80126]: 6.17 scrub starts
Jan 23 09:51:51 compute-1 ceph-mon[80126]: 6.17 scrub ok
Jan 23 09:51:52 compute-1 ceph-mon[80126]: 6.15 scrub starts
Jan 23 09:51:52 compute-1 ceph-mon[80126]: 6.15 scrub ok
Jan 23 09:51:52 compute-1 ceph-mon[80126]: pgmap v17: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:53 compute-1 ceph-mon[80126]: from='client.14424 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:51:53 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:53 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:51:54 compute-1 ceph-mon[80126]: pgmap v18: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:56 compute-1 ceph-mon[80126]: pgmap v19: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:51:56 compute-1 ceph-mon[80126]: from='client.14430 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:51:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch
Jan 23 09:51:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:51:56 compute-1 ceph-mon[80126]: Deploying daemon osd.2 on compute-2
Jan 23 09:51:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:51:57 compute-1 ceph-mon[80126]: pgmap v20: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:00 compute-1 ceph-mon[80126]: pgmap v21: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e39 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:02 compute-1 ceph-mon[80126]: from='client.14439 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:52:02 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:02 compute-1 ceph-mon[80126]: pgmap v22: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:02 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:03 compute-1 ceph-mon[80126]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 09:52:03 compute-1 ceph-mon[80126]: pgmap v23: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:03 compute-1 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
Jan 23 09:52:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e40 e40: 3 total, 2 up, 3 in
Jan 23 09:52:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e41 e41: 3 total, 2 up, 3 in
Jan 23 09:52:07 compute-1 ceph-mon[80126]: purged_snaps scrub starts
Jan 23 09:52:07 compute-1 ceph-mon[80126]: purged_snaps scrub ok
Jan 23 09:52:07 compute-1 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
Jan 23 09:52:07 compute-1 ceph-mon[80126]: from='osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 09:52:07 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:07 compute-1 ceph-mon[80126]: osdmap e40: 3 total, 2 up, 3 in
Jan 23 09:52:07 compute-1 ceph-mon[80126]: pgmap v25: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:07 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:07 compute-1 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]: dispatch
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.784867287s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231704712s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.139515877s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.586380005s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.158160210s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605072021s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.784867287s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231704712s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.158160210s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605072021s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.139515877s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.586380005s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157796860s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605056763s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157771111s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605056763s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157796860s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605056763s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157771111s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605056763s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157330513s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.605087280s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157330513s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.605087280s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156840324s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604675293s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156840324s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604675293s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141293526s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.589202881s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157002449s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604904175s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.157002449s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604904175s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156695366s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604660034s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141293526s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.589202881s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141396523s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.589385986s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.141396523s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.589385986s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783594131s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231658936s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783594131s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231658936s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156449318s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604644775s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156449318s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604644775s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156120300s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604370117s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156564713s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604827881s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156564713s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604827881s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156120300s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604370117s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783216476s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231521606s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783216476s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231521606s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.156695366s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604660034s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155824661s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604232788s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155824661s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604232788s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783000946s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231506348s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.783000946s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231506348s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155418396s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603988647s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155418396s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603988647s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.140769005s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 active pruub 145.589492798s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=41 pruub=10.140769005s) [] r=-1 lpr=41 pi=[27,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 145.589492798s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782610893s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231399536s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782610893s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231399536s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155240059s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604156494s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155240059s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604156494s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154916763s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603881836s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154916763s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603881836s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155308723s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.604339600s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.155308723s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.604339600s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154391289s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603485107s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154391289s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603485107s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154301643s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603439331s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154301643s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603439331s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782231331s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231414795s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154086113s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 active pruub 148.603332520s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782231331s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231414795s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=41 pruub=13.154086113s) [] r=-1 lpr=41 pi=[30,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 148.603332520s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782031059s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 active pruub 149.231353760s@ mbc={}] PeeringState::start_peering_interval up [0] -> [], acting [0] -> [], acting_primary 0 -> -1, up_primary 0 -> -1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:07 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 41 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=41 pruub=13.782031059s) [] r=-1 lpr=41 pi=[24,41)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 149.231353760s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:08 compute-1 ceph-mon[80126]: pgmap v26: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='client.14445 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='osd.2 ' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=compute-2", "root=default"]}]': finished
Jan 23 09:52:08 compute-1 ceph-mon[80126]: osdmap e41: 3 total, 2 up, 3 in
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-2.yzflfx", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:08 compute-1 ceph-mon[80126]: Deploying daemon rgw.rgw.compute-2.yzflfx on compute-2
Jan 23 09:52:09 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:09 compute-1 ceph-mon[80126]: pgmap v28: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:09 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:10 compute-1 sudo[83476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:10 compute-1 sudo[83476]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:10 compute-1 sudo[83476]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:10 compute-1 sudo[83501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:52:10 compute-1 sudo[83501]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/237302038' entity='client.admin' cmd=[{"prefix": "status", "format": "json"}]: dispatch
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-1.syfcuk", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:11 compute-1 podman[83566]: 2026-01-23 09:52:11.158596716 +0000 UTC m=+0.065331549 container create 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:52:11 compute-1 systemd[1]: Started libpod-conmon-44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a.scope.
Jan 23 09:52:11 compute-1 podman[83566]: 2026-01-23 09:52:11.136241359 +0000 UTC m=+0.042976292 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:11 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:52:11 compute-1 podman[83566]: 2026-01-23 09:52:11.252209386 +0000 UTC m=+0.158944329 container init 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, OSD_FLAVOR=default)
Jan 23 09:52:11 compute-1 podman[83566]: 2026-01-23 09:52:11.260181141 +0000 UTC m=+0.166915984 container start 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:52:11 compute-1 podman[83566]: 2026-01-23 09:52:11.263781352 +0000 UTC m=+0.170516225 container attach 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 09:52:11 compute-1 brave_bartik[83582]: 167 167
Jan 23 09:52:11 compute-1 systemd[1]: libpod-44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a.scope: Deactivated successfully.
Jan 23 09:52:11 compute-1 podman[83566]: 2026-01-23 09:52:11.269113126 +0000 UTC m=+0.175848029 container died 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 09:52:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-3b9e42ed369918945c1c16e408fdbf55bf82d3771d57a13416932a0bdaa12bd3-merged.mount: Deactivated successfully.
Jan 23 09:52:11 compute-1 podman[83566]: 2026-01-23 09:52:11.323616712 +0000 UTC m=+0.230351575 container remove 44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=brave_bartik, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325)
Jan 23 09:52:11 compute-1 systemd[1]: libpod-conmon-44df3673f9ebe542d5dc927cd03c720a52ac4374eda023c142e67d541546232a.scope: Deactivated successfully.
Jan 23 09:52:11 compute-1 systemd[1]: Reloading.
Jan 23 09:52:11 compute-1 systemd-rc-local-generator[83621]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:11 compute-1 systemd-sysv-generator[83627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:11 compute-1 systemd[1]: Reloading.
Jan 23 09:52:11 compute-1 systemd-rc-local-generator[83667]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:11 compute-1 systemd-sysv-generator[83670]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:11 compute-1 systemd[1]: Starting Ceph rgw.rgw.compute-1.syfcuk for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:52:12 compute-1 podman[83723]: 2026-01-23 09:52:12.242450254 +0000 UTC m=+0.095678374 container create f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 09:52:12 compute-1 podman[83723]: 2026-01-23 09:52:12.17567846 +0000 UTC m=+0.028906630 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:12 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a584fb83515e1aa529260ee1676c377ccbe1795468353602314f453266ca130c/merged/var/lib/ceph/radosgw/ceph-rgw.rgw.compute-1.syfcuk supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:12 compute-1 podman[83723]: 2026-01-23 09:52:12.355851973 +0000 UTC m=+0.209080113 container init f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:52:12 compute-1 podman[83723]: 2026-01-23 09:52:12.362762315 +0000 UTC m=+0.215990435 container start f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:52:12 compute-1 bash[83723]: f7c6ee44d2d8f0f1344e73a7947d3b52a3fbd5569023e052a4cc0ae65b64d98a
Jan 23 09:52:12 compute-1 systemd[1]: Started Ceph rgw.rgw.compute-1.syfcuk for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:52:12 compute-1 radosgw[83743]: deferred set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:52:12 compute-1 radosgw[83743]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process radosgw, pid 2
Jan 23 09:52:12 compute-1 radosgw[83743]: framework: beast
Jan 23 09:52:12 compute-1 radosgw[83743]: framework conf key: endpoint, val: 192.168.122.101:8082
Jan 23 09:52:12 compute-1 radosgw[83743]: init_numa not setting numa affinity
Jan 23 09:52:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e41 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:12 compute-1 sudo[83501]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e42 e42: 3 total, 2 up, 3 in
Jan 23 09:52:15 compute-1 ceph-mon[80126]: Deploying daemon rgw.rgw.compute-1.syfcuk on compute-1
Jan 23 09:52:15 compute-1 ceph-mon[80126]: pgmap v29: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:15 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:16 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e43 e43: 3 total, 2 up, 3 in
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2988268721' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: pgmap v30: 194 pgs: 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: osdmap e42: 3 total, 2 up, 3 in
Jan 23 09:52:17 compute-1 ceph-mon[80126]: pgmap v32: 195 pgs: 1 unknown, 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2692084146' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": ".rgw.root","app": "rgw"}]': finished
Jan 23 09:52:17 compute-1 ceph-mon[80126]: osdmap e43: 3 total, 2 up, 3 in
Jan 23 09:52:17 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e43 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e44 e44: 3 total, 2 up, 3 in
Jan 23 09:52:19 compute-1 radosgw[83743]: rgw main: failed to create zone with (17) File exists
Jan 23 09:52:19 compute-1 ceph-mon[80126]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:52:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:19 compute-1 ceph-mon[80126]: pgmap v34: 195 pgs: 1 unknown, 194 active+clean; 449 KiB data, 54 MiB used, 40 GiB / 40 GiB avail
Jan 23 09:52:19 compute-1 ceph-mon[80126]: OSD bench result of 2077.197482 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Jan 23 09:52:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.rgw.rgw.compute-0.jbpfwf", "caps": ["mon", "allow *", "mgr", "allow rw", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1421940163' entity='client.admin' cmd=[{"prefix": "osd get-require-min-compat-client"}]: dispatch
Jan 23 09:52:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e45 e45: 3 total, 3 up, 3 in
Jan 23 09:52:20 compute-1 radosgw[83743]: rgw main: failed to create zonegroup with (17) File exists
Jan 23 09:52:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e46 e46: 3 total, 3 up, 3 in
Jan 23 09:52:21 compute-1 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:52:21 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:21 compute-1 ceph-mon[80126]: pgmap v35: 195 pgs: 195 active+clean; 450 KiB data, 54 MiB used, 40 GiB / 40 GiB avail; 255 B/s rd, 511 B/s wr, 1 op/s
Jan 23 09:52:21 compute-1 ceph-mon[80126]: osdmap e44: 3 total, 2 up, 3 in
Jan 23 09:52:21 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:21 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:21 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:21 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:21 compute-1 ceph-mon[80126]: Deploying daemon rgw.rgw.compute-0.jbpfwf on compute-0
Jan 23 09:52:21 compute-1 ceph-mon[80126]: osd.2 [v2:192.168.122.102:6800/1020282776,v1:192.168.122.102:6801/1020282776] boot
Jan 23 09:52:21 compute-1 ceph-mon[80126]: osdmap e45: 3 total, 3 up, 3 in
Jan 23 09:52:21 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:52:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"} v 0)
Jan 23 09:52:22 compute-1 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-1 ceph-mon[80126]: pgmap v38: 195 pgs: 28 peering, 167 active+clean; 450 KiB data, 480 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 682 B/s wr, 1 op/s
Jan 23 09:52:22 compute-1 ceph-mon[80126]: osdmap e46: 3 total, 3 up, 3 in
Jan 23 09:52:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]: dispatch
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[10.0( empty local-lis/les=0/0 n=0 ec=46/46 lis/c=0/0 les/c/f=0/0/0 sis=46) [0] r=0 lpr=46 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.1f( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.1f( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.1c( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.15( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.18( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.1e( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.12( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[6.17( empty local-lis/les=30/31 n=0 ec=26/17 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.16( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.12( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.15( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.11( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.f( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.9( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.8( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[5.4( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.b( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.11( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[7.5( empty local-lis/les=27/28 n=0 ec=27/18 lis/c=27/27 les/c/f=28/28/0 sis=45) [2] r=-1 lpr=45 pi=[27,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.e( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.9( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[4.1( empty local-lis/les=30/31 n=0 ec=24/15 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e46 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[5.e( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.1a( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.5( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[3.1d( empty local-lis/les=30/31 n=0 ec=24/14 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[5.1a( empty local-lis/les=30/31 n=0 ec=26/16 lis/c=30/30 les/c/f=31/31/0 sis=45) [2] r=-1 lpr=45 pi=[30,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 45 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] PeeringState::start_peering_interval up [] -> [2], acting [] -> [2], acting_primary ? -> 2, up_primary ? -> 2, role -1 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.1d( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 46 pg[2.1c( empty local-lis/les=24/25 n=0 ec=24/13 lis/c=24/24 les/c/f=25/25/0 sis=45) [2] r=-1 lpr=45 pi=[24,45)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e47 e47: 3 total, 3 up, 3 in
Jan 23 09:52:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e48 e48: 3 total, 3 up, 3 in
Jan 23 09:52:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1010663506' entity='client.admin' cmd=[{"prefix": "versions", "format": "json"}]: dispatch
Jan 23 09:52:24 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 09:52:24 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.log","app": "rgw"}]': finished
Jan 23 09:52:24 compute-1 ceph-mon[80126]: osdmap e47: 3 total, 3 up, 3 in
Jan 23 09:52:24 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:24 compute-1 ceph-mon[80126]: pgmap v41: 196 pgs: 1 unknown, 28 peering, 167 active+clean; 450 KiB data, 480 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:24 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 48 pg[10.0( empty local-lis/les=46/48 n=0 ec=46/46 lis/c=0/0 les/c/f=0/0/0 sis=46) [0] r=0 lpr=46 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:25 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e49 e49: 3 total, 3 up, 3 in
Jan 23 09:52:25 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:25 compute-1 ceph-mon[80126]: osdmap e48: 3 total, 3 up, 3 in
Jan 23 09:52:25 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:26 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"} v 0)
Jan 23 09:52:26 compute-1 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: Saving service rgw.rgw spec with placement compute-0;compute-1;compute-2
Jan 23 09:52:27 compute-1 ceph-mon[80126]: pgmap v43: 196 pgs: 1 creating+peering, 28 peering, 167 active+clean; 450 KiB data, 481 MiB used, 60 GiB / 60 GiB avail; 5.0 KiB/s rd, 795 B/s wr, 6 op/s
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:27 compute-1 ceph-mon[80126]: osdmap e49: 3 total, 3 up, 3 in
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-2.prgzmm", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 09:52:27 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e50 e50: 3 total, 3 up, 3 in
Jan 23 09:52:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e50 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:28 compute-1 ceph-mon[80126]: Deploying daemon mds.cephfs.compute-2.prgzmm on compute-2
Jan 23 09:52:28 compute-1 ceph-mon[80126]: pgmap v45: 197 pgs: 1 unknown, 1 creating+peering, 195 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 4.7 KiB/s rd, 744 B/s wr, 6 op/s
Jan 23 09:52:28 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 09:52:28 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 09:52:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.control","app": "rgw"}]': finished
Jan 23 09:52:28 compute-1 ceph-mon[80126]: osdmap e50: 3 total, 3 up, 3 in
Jan 23 09:52:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e51 e51: 3 total, 3 up, 3 in
Jan 23 09:52:29 compute-1 ceph-mon[80126]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:52:29 compute-1 ceph-mon[80126]: osdmap e51: 3 total, 3 up, 3 in
Jan 23 09:52:29 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:30 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e52 e52: 3 total, 3 up, 3 in
Jan 23 09:52:30 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"} v 0)
Jan 23 09:52:30 compute-1 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-1 ceph-mon[80126]: pgmap v48: 197 pgs: 197 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:30 compute-1 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:52:30 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:30 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:30 compute-1 ceph-mon[80126]: osdmap e52: 3 total, 3 up, 3 in
Jan 23 09:52:30 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:30 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e3 new map
Jan 23 09:52:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e3 print_map
                                           e3
                                           btime 2026-01-23T09:52:30:834166+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        2
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:51:34.000760+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        
                                           up        {}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-2.prgzmm{-1:24193} state up:standby seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e4 new map
Jan 23 09:52:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e4 print_map
                                           e4
                                           btime 2026-01-23T09:52:31:070018+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        4
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:31.070004+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 0 members: 
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:creating seq 1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Jan 23 09:52:31 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 52 pg[12.0( empty local-lis/les=0/0 n=0 ec=52/52 lis/c=0/0 les/c/f=0/0/0 sis=52) [0] r=0 lpr=52 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e53 e53: 3 total, 3 up, 3 in
Jan 23 09:52:31 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 53 pg[12.0( empty local-lis/les=52/53 n=0 ec=52/52 lis/c=0/0 les/c/f=0/0/0 sis=52) [0] r=0 lpr=52 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"} v 0)
Jan 23 09:52:31 compute-1 ceph-mon[80126]: log_channel(audit) log [INF] : from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-0.ymknms", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: Deploying daemon mds.cephfs.compute-0.ymknms on compute-0
Jan 23 09:52:32 compute-1 ceph-mon[80126]: pgmap v50: 198 pgs: 1 unknown, 197 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:32 compute-1 ceph-mon[80126]: mds.? [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] up:boot
Jan 23 09:52:32 compute-1 ceph-mon[80126]: daemon mds.cephfs.compute-2.prgzmm assigned to filesystem cephfs as rank 0 (now has 1 ranks)
Jan 23 09:52:32 compute-1 ceph-mon[80126]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
Jan 23 09:52:32 compute-1 ceph-mon[80126]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds)
Jan 23 09:52:32 compute-1 ceph-mon[80126]: Cluster is now healthy
Jan 23 09:52:32 compute-1 ceph-mon[80126]: fsmap cephfs:0 1 up:standby
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.prgzmm"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:creating}
Jan 23 09:52:32 compute-1 ceph-mon[80126]: daemon mds.cephfs.compute-2.prgzmm is now active in filesystem cephfs as rank 0
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool application enable","pool": "default.rgw.meta","app": "rgw"}]': finished
Jan 23 09:52:32 compute-1 ceph-mon[80126]: osdmap e53: 3 total, 3 up, 3 in
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3935157835' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1572426654' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]: dispatch
Jan 23 09:52:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e53 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e5 new map
Jan 23 09:52:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e5 print_map
                                           e5
                                           btime 2026-01-23T09:52:32:417167+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:32.417165+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        0
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
Jan 23 09:52:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e54 e54: 3 total, 3 up, 3 in
Jan 23 09:52:33 compute-1 ceph-mon[80126]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED)
Jan 23 09:52:33 compute-1 ceph-mon[80126]: mds.? [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] up:active
Jan 23 09:52:33 compute-1 ceph-mon[80126]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active}
Jan 23 09:52:33 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/681468082' entity='client.rgw.rgw.compute-0.jbpfwf' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 09:52:33 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-2.yzflfx' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 09:52:33 compute-1 ceph-mon[80126]: from='client.? ' entity='client.rgw.rgw.compute-1.syfcuk' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_autoscale_bias", "val": "4"}]': finished
Jan 23 09:52:33 compute-1 ceph-mon[80126]: osdmap e54: 3 total, 3 up, 3 in
Jan 23 09:52:33 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e6 new map
Jan 23 09:52:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e6 print_map
                                           e6
                                           btime 2026-01-23T09:52:33:487599+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:32.417165+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:34 compute-1 radosgw[83743]: v1 topic migration: starting v1 topic migration..
Jan 23 09:52:34 compute-1 radosgw[83743]: LDAP not started since no server URIs were provided in the configuration.
Jan 23 09:52:34 compute-1 radosgw[83743]: v1 topic migration: finished v1 topic migration
Jan 23 09:52:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-rgw-rgw-compute-1-syfcuk[83739]: 2026-01-23T09:52:34.017+0000 7f0b6033c980 -1 LDAP not started since no server URIs were provided in the configuration.
Jan 23 09:52:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000000 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000002 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-1 radosgw[83743]: framework: beast
Jan 23 09:52:34 compute-1 radosgw[83743]: framework conf key: ssl_certificate, val: config://rgw/cert/$realm/$zone.crt
Jan 23 09:52:34 compute-1 radosgw[83743]: framework conf key: ssl_private_key, val: config://rgw/cert/$realm/$zone.key
Jan 23 09:52:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000004 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-1 radosgw[83743]: starting handler: beast
Jan 23 09:52:34 compute-1 radosgw[83743]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:52:34 compute-1 radosgw[83743]: mgrc service_daemon_register rgw.24176 metadata {arch=x86_64,ceph_release=squid,ceph_version=ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable),ceph_version_short=19.2.3,container_hostname=compute-1,container_image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec,cpu=AMD EPYC-Rome Processor,distro=centos,distro_description=CentOS Stream 9,distro_version=9,frontend_config#0=beast endpoint=192.168.122.101:8082,frontend_type#0=beast,hostname=compute-1,id=rgw.compute-1.syfcuk,kernel_description=#1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026,kernel_version=5.14.0-661.el9.x86_64,mem_swap_kb=1048572,mem_total_kb=7864316,num_handles=1,os=Linux,pid=2,realm_id=,realm_name=,zone_id=75d0a494-c738-4cca-b87e-be71cfd0ed45,zone_name=default,zonegroup_id=6635d7c3-d02c-4c4b-90b3-4ee042e293d6,zonegroup_name=default}
Jan 23 09:52:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000006 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000008 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e55 e55: 3 total, 3 up, 3 in
Jan 23 09:52:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000010 to be held by another RGW process; skipping for now
Jan 23 09:52:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000012 to be held by another RGW process; skipping for now
Jan 23 09:52:35 compute-1 ceph-mon[80126]: pgmap v53: 198 pgs: 1 unknown, 197 active+clean; 450 KiB data, 81 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:52:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:35 compute-1 ceph-mon[80126]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled)
Jan 23 09:52:35 compute-1 ceph-mon[80126]: Cluster is now healthy
Jan 23 09:52:35 compute-1 ceph-mon[80126]: mds.? [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] up:boot
Jan 23 09:52:35 compute-1 ceph-mon[80126]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 1 up:standby
Jan 23 09:52:35 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.ymknms"}]: dispatch
Jan 23 09:52:35 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000014 to be held by another RGW process; skipping for now
Jan 23 09:52:35 compute-1 sudo[84365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:35 compute-1 sudo[84365]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:35 compute-1 sudo[84365]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:35 compute-1 sudo[84390]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:52:35 compute-1 sudo[84390]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:35 compute-1 podman[84454]: 2026-01-23 09:52:35.891646965 +0000 UTC m=+0.087664779 container create a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:52:35 compute-1 podman[84454]: 2026-01-23 09:52:35.827999266 +0000 UTC m=+0.024017110 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:36 compute-1 systemd[1]: Started libpod-conmon-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope.
Jan 23 09:52:36 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:52:36 compute-1 podman[84454]: 2026-01-23 09:52:36.06575423 +0000 UTC m=+0.261772144 container init a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 09:52:36 compute-1 podman[84454]: 2026-01-23 09:52:36.086058683 +0000 UTC m=+0.282076507 container start a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Jan 23 09:52:36 compute-1 podman[84454]: 2026-01-23 09:52:36.089932743 +0000 UTC m=+0.285950577 container attach a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:52:36 compute-1 affectionate_hellman[84469]: 167 167
Jan 23 09:52:36 compute-1 systemd[1]: libpod-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope: Deactivated successfully.
Jan 23 09:52:36 compute-1 conmon[84469]: conmon a3b467b128f2927a220b <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope/container/memory.events
Jan 23 09:52:36 compute-1 podman[84454]: 2026-01-23 09:52:36.09668308 +0000 UTC m=+0.292700894 container died a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325)
Jan 23 09:52:36 compute-1 systemd[1]: var-lib-containers-storage-overlay-c6c21647047ab91f3891dddc234e133b90e1a36c1c8aa506f5557ff081485538-merged.mount: Deactivated successfully.
Jan 23 09:52:36 compute-1 podman[84454]: 2026-01-23 09:52:36.144043828 +0000 UTC m=+0.340061642 container remove a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=affectionate_hellman, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:52:36 compute-1 systemd[1]: libpod-conmon-a3b467b128f2927a220bec9e920d1ef41f975dad4ff9bdefcf8137db88073783.scope: Deactivated successfully.
Jan 23 09:52:36 compute-1 systemd[1]: Reloading.
Jan 23 09:52:36 compute-1 systemd-rc-local-generator[84512]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:36 compute-1 systemd-sysv-generator[84516]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:36 compute-1 systemd[1]: Reloading.
Jan 23 09:52:36 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e56 e56: 3 total, 3 up, 3 in
Jan 23 09:52:36 compute-1 systemd-rc-local-generator[84554]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:36 compute-1 systemd-sysv-generator[84558]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:36 compute-1 ceph-mon[80126]: osdmap e55: 3 total, 3 up, 3 in
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:36 compute-1 ceph-mon[80126]: pgmap v55: 198 pgs: 198 active+clean; 456 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 11 KiB/s wr, 41 op/s
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.compute-1.bcvzvj", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished
Jan 23 09:52:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:36 compute-1 ceph-mon[80126]: Deploying daemon mds.cephfs.compute-1.bcvzvj on compute-1
Jan 23 09:52:36 compute-1 systemd[1]: Starting Ceph mds.cephfs.compute-1.bcvzvj for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:52:37 compute-1 podman[84611]: 2026-01-23 09:52:37.081922985 +0000 UTC m=+0.107530709 container create ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 23 09:52:37 compute-1 podman[84611]: 2026-01-23 09:52:36.997352784 +0000 UTC m=+0.022960508 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:37 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/561286b795295de8b21b8cbecef711aec0184dced6f9738ecdcf9b77b99ba6e0/merged/var/lib/ceph/mds/ceph-cephfs.compute-1.bcvzvj supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:37 compute-1 podman[84611]: 2026-01-23 09:52:37.307942117 +0000 UTC m=+0.333549921 container init ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 09:52:37 compute-1 podman[84611]: 2026-01-23 09:52:37.314936432 +0000 UTC m=+0.340544166 container start ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:52:37 compute-1 bash[84611]: ce7c85a7584e9b88c49c1a58d395d558c61671826c812a66de294289df884d26
Jan 23 09:52:37 compute-1 systemd[1]: Started Ceph mds.cephfs.compute-1.bcvzvj for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:52:37 compute-1 ceph-mds[84630]: set uid:gid to 167:167 (ceph:ceph)
Jan 23 09:52:37 compute-1 ceph-mds[84630]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mds, pid 2
Jan 23 09:52:37 compute-1 ceph-mds[84630]: main not setting numa affinity
Jan 23 09:52:37 compute-1 ceph-mds[84630]: pidfile_write: ignore empty --pid-file
Jan 23 09:52:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mds-cephfs-compute-1-bcvzvj[84626]: starting mds.cephfs.compute-1.bcvzvj at 
Jan 23 09:52:37 compute-1 sudo[84390]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e57 e57: 3 total, 3 up, 3 in
Jan 23 09:52:37 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Updating MDS map to version 6 from mon.2
Jan 23 09:52:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e57 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:38 compute-1 ceph-mon[80126]: osdmap e56: 3 total, 3 up, 3 in
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:38 compute-1 ceph-mon[80126]: pgmap v57: 229 pgs: 31 unknown, 198 active+clean; 456 KiB data, 81 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 8.8 KiB/s wr, 34 op/s
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:38 compute-1 ceph-mon[80126]: osdmap e57: 3 total, 3 up, 3 in
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e58 e58: 3 total, 3 up, 3 in
Jan 23 09:52:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e7 new map
Jan 23 09:52:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e7 print_map
                                           e7
                                           btime 2026-01-23T09:52:38:529421+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        5
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:32.417165+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 2 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:38 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Updating MDS map to version 7 from mon.2
Jan 23 09:52:38 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Monitors have assigned me to become a standby
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-1 ceph-mon[80126]: 8.16 scrub starts
Jan 23 09:52:39 compute-1 ceph-mon[80126]: 8.16 scrub ok
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:39 compute-1 ceph-mon[80126]: osdmap e58: 3 total, 3 up, 3 in
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]: dispatch
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 09:52:39 compute-1 ceph-mon[80126]: mds.? [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] up:boot
Jan 23 09:52:39 compute-1 ceph-mon[80126]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 09:52:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.bcvzvj"}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e59 e59: 3 total, 3 up, 3 in
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[10.0( v 58'754 (0'0,58'754] local-lis/les=46/48 n=136 ec=46/46 lis/c=46/46 les/c/f=48/48/0 sis=59 pruub=8.448954582s) [0] r=0 lpr=59 pi=[46,59)/1 luod=58'752 crt=58'754 lcod 58'751 mlcod 58'751 active pruub 176.745971680s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[10.0( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=46/46 lis/c=46/46 les/c/f=48/48/0 sis=59 pruub=8.448954582s) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 58'751 mlcod 0'0 unknown pruub 176.745971680s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85428 space 0x55a560c13ef0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bac208 space 0x55a560c331f0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bad108 space 0x55a560c12760 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd60c8 space 0x55a560c32900 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bac168 space 0x55a560c32eb0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9cca8 space 0x55a560c329d0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bce848 space 0x55a560c000e0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd6528 space 0x55a560c13bb0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560baca28 space 0x55a560c32d10 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85068 space 0x55a560c13c80 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9c3e8 space 0x55a560c32830 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd7c48 space 0x55a560c33600 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bcfc48 space 0x55a560c32f80 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85ce8 space 0x55a55fde4d10 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bacb68 space 0x55a560c32de0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9d568 space 0x55a560c32690 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560ba20c8 space 0x55a560a901b0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd7ba8 space 0x55a560c12420 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560c365c8 space 0x55a560c32b70 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd6168 space 0x55a560aa4aa0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd7108 space 0x55a560a64900 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd6b68 space 0x55a560c13a10 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560937c48 space 0x55a560ae3c80 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b85928 space 0x55a560ae3ae0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd72e8 space 0x55a560c12c40 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bd5ba8 space 0x55a560c332c0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bac708 space 0x55a560c33120 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bceca8 space 0x55a560a90010 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560b9db08 space 0x55a560c32aa0 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0).collection(10.0_head 0x55a5603f78c0) operator()   moving buffer(0x55a560bce7a8 space 0x55a560a91c80 0x0~1000 clean)
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.15( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.14( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.17( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.11( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.10( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.10( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.e( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.8( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.f( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.d( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.a( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.6( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.4( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.1b( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.19( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.18( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[9.12( empty local-lis/les=0/0 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 59 pg[8.12( empty local-lis/les=0/0 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:40 compute-1 ceph-mon[80126]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 09:52:40 compute-1 ceph-mon[80126]: pgmap v60: 260 pgs: 260 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 216 KiB/s rd, 0 B/s wr, 364 op/s
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: Ensuring nfs.cephfs.0 is in the ganesha grace table
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: 8.14 scrub starts
Jan 23 09:52:40 compute-1 ceph-mon[80126]: 8.14 scrub ok
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num", "val": "32"}]': finished
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".nfs", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": ".rgw.root", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:40 compute-1 ceph-mon[80126]: osdmap e59: 3 total, 3 up, 3 in
Jan 23 09:52:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e8 new map
Jan 23 09:52:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e8 print_map
                                           e8
                                           btime 2026-01-23T09:52:40:798611+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:39.805778+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:41 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e60 e60: 3 total, 3 up, 3 in
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1b( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.7( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.12( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.11( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.10( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1f( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1e( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1d( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1c( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1a( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.19( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.18( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.5( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.4( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.6( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.3( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.b( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.8( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.d( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.9( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.a( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.c( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.e( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.f( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.2( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.13( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.14( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.15( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.16( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.17( v 58'754 lc 0'0 (0'0,58'754] local-lis/les=46/48 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.d( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.14( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.10( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.e( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.17( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.15( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.a( v 44'12 lc 44'8 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.1b( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.4( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.18( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.12( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.6( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=59/60 n=1 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.12( v 44'12 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.10( v 37'1 lc 0'0 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.11( v 44'12 lc 43'1 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[9.f( v 44'12 lc 0'0 (0'0,44'12] local-lis/les=59/60 n=0 ec=57/42 lis/c=57/57 les/c/f=58/58/0 sis=59) [0] r=0 lpr=59 pi=[57,59)/1 crt=44'12 mlcod 0'0 active+degraded m=3 mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.19( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.7( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.11( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.12( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1e( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1f( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[8.8( v 37'1 (0'0,37'1] local-lis/les=59/60 n=0 ec=56/36 lis/c=56/56 les/c/f=57/57/0 sis=59) [0] r=0 lpr=59 pi=[56,59)/1 crt=37'1 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1c( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1d( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1a( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.19( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.5( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1b( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.4( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.b( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.d( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.3( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.8( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.6( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.a( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.c( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.2( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.0( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=46/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 58'751 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.13( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.1( v 58'754 (0'0,58'754] local-lis/les=59/60 n=5 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.15( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.14( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.17( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.e( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 60 pg[10.f( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=46/46 les/c/f=48/48/0 sis=59) [0] r=0 lpr=59 pi=[46,59)/1 crt=58'754 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:41 compute-1 sudo[84650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:41 compute-1 sudo[84650]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:41 compute-1 sudo[84650]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:41 compute-1 sudo[84675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:52:41 compute-1 sudo[84675]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:42 compute-1 ceph-mon[80126]: 9.14 scrub starts
Jan 23 09:52:42 compute-1 ceph-mon[80126]: 9.14 scrub ok
Jan 23 09:52:42 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 09:52:42 compute-1 ceph-mon[80126]: mds.? [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] up:active
Jan 23 09:52:42 compute-1 ceph-mon[80126]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 09:52:42 compute-1 ceph-mon[80126]: pgmap v62: 322 pgs: 62 unknown, 260 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 206 KiB/s rd, 0 B/s wr, 347 op/s
Jan 23 09:52:42 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:42 compute-1 ceph-mon[80126]: osdmap e60: 3 total, 3 up, 3 in
Jan 23 09:52:42 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:42 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.0.0.compute-1.bawllm-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:42 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:42 compute-1 podman[84740]: 2026-01-23 09:52:42.050667265 +0000 UTC m=+0.031386247 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:42 compute-1 podman[84740]: 2026-01-23 09:52:42.296460215 +0000 UTC m=+0.277179157 container create e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:52:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e9 new map
Jan 23 09:52:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).mds e9 print_map
                                           e9
                                           btime 2026-01-23T09:52:42:200523+0000
                                           enable_multiple, ever_enabled_multiple: 1,1
                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           legacy client fscid: 1
                                            
                                           Filesystem 'cephfs' (1)
                                           fs_name        cephfs
                                           epoch        8
                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                           created        2026-01-23T09:51:34.000760+0000
                                           modified        2026-01-23T09:52:39.805778+0000
                                           tableserver        0
                                           root        0
                                           session_timeout        60
                                           session_autoclose        300
                                           max_file_size        1099511627776
                                           max_xattr_size        65536
                                           required_client_features        {}
                                           last_failure        0
                                           last_failure_osd_epoch        0
                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes}
                                           max_mds        1
                                           in        0
                                           up        {0=24193}
                                           failed        
                                           damaged        
                                           stopped        
                                           data_pools        [7]
                                           metadata_pool        6
                                           inline_data        disabled
                                           balancer        
                                           bal_rank_mask        -1
                                           standby_count_wanted        1
                                           qdb_cluster        leader: 24193 members: 24193
                                           [mds.cephfs.compute-2.prgzmm{0:24193} state up:active seq 4 join_fscid=1 addr [v2:192.168.122.102:6804/1390112456,v1:192.168.122.102:6805/1390112456] compat {c=[1],r=[1],i=[1fff]}]
                                            
                                            
                                           Standby daemons:
                                            
                                           [mds.cephfs.compute-0.ymknms{-1:14502} state up:standby seq 3 join_fscid=1 addr [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] compat {c=[1],r=[1],i=[1fff]}]
                                           [mds.cephfs.compute-1.bcvzvj{-1:24200} state up:standby seq 2 join_fscid=1 addr [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] compat {c=[1],r=[1],i=[1fff]}]
Jan 23 09:52:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e61 e61: 3 total, 3 up, 3 in
Jan 23 09:52:42 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Updating MDS map to version 9 from mon.2
Jan 23 09:52:42 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 61 pg[12.0( empty local-lis/les=52/53 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=61 pruub=13.239601135s) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active pruub 183.824020386s@ mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [0], acting_primary 0 -> 0, up_primary 0 -> 0, role 0 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:42 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 61 pg[12.0( empty local-lis/les=52/53 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=61 pruub=13.239601135s) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown pruub 183.824020386s@ mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:42 compute-1 systemd[1]: Started libpod-conmon-e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348.scope.
Jan 23 09:52:42 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.a scrub starts
Jan 23 09:52:42 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:52:42 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.a scrub ok
Jan 23 09:52:42 compute-1 podman[84740]: 2026-01-23 09:52:42.562659523 +0000 UTC m=+0.543378455 container init e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:52:42 compute-1 podman[84740]: 2026-01-23 09:52:42.57559554 +0000 UTC m=+0.556314442 container start e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:52:42 compute-1 podman[84740]: 2026-01-23 09:52:42.580527912 +0000 UTC m=+0.561246844 container attach e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 23 09:52:42 compute-1 elastic_elbakyan[84756]: 167 167
Jan 23 09:52:42 compute-1 systemd[1]: libpod-e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348.scope: Deactivated successfully.
Jan 23 09:52:42 compute-1 podman[84740]: 2026-01-23 09:52:42.585526746 +0000 UTC m=+0.566245678 container died e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 09:52:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e61 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:42 compute-1 systemd[1]: var-lib-containers-storage-overlay-f945ce81a4cb406bb8313d4d2c99099af9fd1612f9651a19907dd2f77fde1b38-merged.mount: Deactivated successfully.
Jan 23 09:52:42 compute-1 podman[84740]: 2026-01-23 09:52:42.778475291 +0000 UTC m=+0.759194193 container remove e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=elastic_elbakyan, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, ceph=True, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 09:52:42 compute-1 systemd[1]: libpod-conmon-e7b7479845bb1bc424ed372993a8a0247c16c90f8c7d8d2c1dc0925bceb66348.scope: Deactivated successfully.
Jan 23 09:52:43 compute-1 ceph-mon[80126]: Rados config object exists: conf-nfs.cephfs
Jan 23 09:52:43 compute-1 ceph-mon[80126]: Creating key for client.nfs.cephfs.0.0.compute-1.bawllm-rgw
Jan 23 09:52:43 compute-1 ceph-mon[80126]: 9.2 scrub starts
Jan 23 09:52:43 compute-1 ceph-mon[80126]: 9.2 scrub ok
Jan 23 09:52:43 compute-1 ceph-mon[80126]: Bind address in nfs.cephfs.0.0.compute-1.bawllm's ganesha conf is defaulting to empty
Jan 23 09:52:43 compute-1 ceph-mon[80126]: Deploying daemon nfs.cephfs.0.0.compute-1.bawllm on compute-1
Jan 23 09:52:43 compute-1 ceph-mon[80126]: 9.16 scrub starts
Jan 23 09:52:43 compute-1 ceph-mon[80126]: 9.16 scrub ok
Jan 23 09:52:43 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pg_num_actual", "val": "32"}]': finished
Jan 23 09:52:43 compute-1 ceph-mon[80126]: osdmap e61: 3 total, 3 up, 3 in
Jan 23 09:52:43 compute-1 ceph-mon[80126]: mds.? [v2:192.168.122.100:6806/3718923574,v1:192.168.122.100:6807/3718923574] up:standby
Jan 23 09:52:43 compute-1 ceph-mon[80126]: mds.? [v2:192.168.122.101:6804/2199615937,v1:192.168.122.101:6805/2199615937] up:standby
Jan 23 09:52:43 compute-1 ceph-mon[80126]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 09:52:43 compute-1 ceph-mon[80126]: 9.a scrub starts
Jan 23 09:52:43 compute-1 ceph-mon[80126]: 9.a scrub ok
Jan 23 09:52:43 compute-1 systemd[1]: Reloading.
Jan 23 09:52:43 compute-1 systemd-rc-local-generator[84802]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:43 compute-1 systemd-sysv-generator[84805]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:43 compute-1 systemd[1]: Reloading.
Jan 23 09:52:43 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.6 deep-scrub starts
Jan 23 09:52:43 compute-1 systemd-rc-local-generator[84842]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:52:43 compute-1 systemd-sysv-generator[84847]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:52:43 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.6 deep-scrub ok
Jan 23 09:52:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e62 e62: 3 total, 3 up, 3 in
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.10( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.13( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.12( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.15( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.4( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.7( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.6( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.9( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.11( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.8( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.c( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.f( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.b( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.a( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.e( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.d( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.5( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.2( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.3( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1e( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1f( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1c( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1a( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1b( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.18( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.19( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.16( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.17( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.14( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1d( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1( empty local-lis/les=52/53 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:43 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.10( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.13( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.15( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.12( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.4( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.7( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.11( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.6( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.8( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.9( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.5( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.0( empty local-lis/les=61/62 n=0 ec=52/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.2( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.3( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.f( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1f( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.18( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.16( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.14( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.17( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.1( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 62 pg[12.19( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=52/52 les/c/f=53/53/0 sis=61) [0] r=0 lpr=61 pi=[52,61)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:44 compute-1 podman[84897]: 2026-01-23 09:52:44.130299891 +0000 UTC m=+0.075700800 container create 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:52:44 compute-1 podman[84897]: 2026-01-23 09:52:44.078908509 +0000 UTC m=+0.024309438 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:52:44 compute-1 ceph-mon[80126]: 9.17 scrub starts
Jan 23 09:52:44 compute-1 ceph-mon[80126]: 9.17 scrub ok
Jan 23 09:52:44 compute-1 ceph-mon[80126]: pgmap v65: 353 pgs: 93 unknown, 260 active+clean; 456 KiB data, 85 MiB used, 60 GiB / 60 GiB avail; 211 KiB/s rd, 0 B/s wr, 356 op/s
Jan 23 09:52:44 compute-1 ceph-mon[80126]: 9.6 deep-scrub starts
Jan 23 09:52:44 compute-1 ceph-mon[80126]: 9.6 deep-scrub ok
Jan 23 09:52:44 compute-1 ceph-mon[80126]: osdmap e62: 3 total, 3 up, 3 in
Jan 23 09:52:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.11 scrub starts
Jan 23 09:52:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.11 scrub ok
Jan 23 09:52:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:52:44 compute-1 podman[84897]: 2026-01-23 09:52:44.638376327 +0000 UTC m=+0.583777326 container init 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:52:44 compute-1 podman[84897]: 2026-01-23 09:52:44.645424895 +0000 UTC m=+0.590825844 container start 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Jan 23 09:52:44 compute-1 bash[84897]: 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:52:44 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:52:44 compute-1 sudo[84675]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:52:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:45 compute-1 ceph-mon[80126]: 8.3 scrub starts
Jan 23 09:52:45 compute-1 ceph-mon[80126]: 8.3 scrub ok
Jan 23 09:52:45 compute-1 ceph-mon[80126]: 9.11 scrub starts
Jan 23 09:52:45 compute-1 ceph-mon[80126]: 9.11 scrub ok
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 09:52:45 compute-1 ceph-mon[80126]: 8.15 scrub starts
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 09:52:45 compute-1 ceph-mon[80126]: 8.15 scrub ok
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]: dispatch
Jan 23 09:52:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:52:45 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.f scrub starts
Jan 23 09:52:45 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.f scrub ok
Jan 23 09:52:46 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.10 deep-scrub starts
Jan 23 09:52:46 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.10 deep-scrub ok
Jan 23 09:52:46 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e63 e63: 3 total, 3 up, 3 in
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.11( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195859909s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157989502s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.11( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195804596s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157989502s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.10( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195192337s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157638550s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.10( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.195178032s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157638550s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.617496490s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.580047607s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.617434502s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.580047607s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.13( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194779396s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157638550s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.13( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194762230s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157638550s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.12( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194700241s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157760620s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.12( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194680214s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157760620s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616744041s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616715431s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579940796s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616385460s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.580001831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616312027s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 185.580001831s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616270065s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 185.580001831s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.7( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194222450s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157913208s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.7( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194145203s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157913208s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.6( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194022179s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157989502s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.6( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.194001198s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157989502s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.615398407s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 185.579849243s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.615350723s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 185.579849243s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.9( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193523407s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158020020s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193923950s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158660889s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.616319656s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.580001831s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193905830s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158660889s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.8( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193208694s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158004761s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.8( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193188667s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158004761s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.9( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193440437s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158020020s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193057060s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158050537s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193034172s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158050537s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614524841s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 185.579772949s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193387985s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158676147s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614482880s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 185.579772949s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.b( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193368912s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158676147s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614227295s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 185.579681396s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.614196777s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 185.579681396s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193170547s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158721924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.193154335s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158721924s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.4( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192028046s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.157821655s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613624573s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579666138s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613601685s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579666138s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613553047s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'765 lcod 62'764 mlcod 62'764 active pruub 185.579681396s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613502502s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'765 lcod 62'764 mlcod 0'0 unknown NOTIFY pruub 185.579681396s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.2( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192445755s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158721924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.2( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192432404s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158721924s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613152504s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'766 lcod 62'765 mlcod 62'765 active pruub 185.579589844s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.3( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192131996s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158721924s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.613007545s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'766 lcod 62'765 mlcod 0'0 unknown NOTIFY pruub 185.579589844s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.3( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.192109108s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158721924s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.4( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191854477s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.157821655s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191747665s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.158737183s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.612077713s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 185.579498291s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1e( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191413879s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.158737183s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191987038s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159423828s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1c( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191963196s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159423828s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191950798s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159484863s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1a( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191933632s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159484863s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611672401s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 185.579452515s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.19( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191948891s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159820557s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.18( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191813469s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159683228s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.19( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191926956s) [1] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159820557s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611551285s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 185.579452515s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.18( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191771507s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159683228s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611083031s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 185.579376221s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611630440s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 185.579498291s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.611034393s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 185.579376221s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.17( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191430092s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159805298s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.17( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.191412926s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159805298s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610406876s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579299927s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610637665s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=60'756 lcod 60'755 mlcod 60'755 active pruub 185.579574585s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610358238s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579299927s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610615730s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=60'756 lcod 60'755 mlcod 0'0 unknown NOTIFY pruub 185.579574585s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610351562s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'769 lcod 62'768 mlcod 62'768 active pruub 185.579330444s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=63 pruub=10.610065460s) [2] r=-1 lpr=63 pi=[59,63)/1 crt=62'769 lcod 62'768 mlcod 0'0 unknown NOTIFY pruub 185.579330444s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.190383911s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 active pruub 188.159881592s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[12.1d( empty local-lis/les=61/62 n=0 ec=61/52 lis/c=61/61 les/c/f=62/62/0 sis=63 pruub=13.190237999s) [2] r=-1 lpr=63 pi=[61,63)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 188.159881592s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.14( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.12( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.f( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.4( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.5( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.7( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1a( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1c( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1d( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1b( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 63 pg[11.1e( empty local-lis/les=0/0 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:46 compute-1 ceph-mon[80126]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi
Jan 23 09:52:46 compute-1 ceph-mon[80126]: Ensuring nfs.cephfs.1 is in the ganesha grace table
Jan 23 09:52:46 compute-1 ceph-mon[80126]: pgmap v67: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 28 KiB/s rd, 0 B/s wr, 57 op/s; 105 B/s, 0 objects/s recovering
Jan 23 09:52:46 compute-1 ceph-mon[80126]: 9.f scrub starts
Jan 23 09:52:46 compute-1 ceph-mon[80126]: 9.f scrub ok
Jan 23 09:52:47 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.d scrub starts
Jan 23 09:52:47 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.d scrub ok
Jan 23 09:52:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e63 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e64 e64: 3 total, 3 up, 3 in
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=60'756 lcod 60'755 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=60'756 lcod 60'755 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'769 lcod 62'768 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'769 lcod 62'768 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'766 lcod 62'765 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'766 lcod 62'765 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'765 lcod 62'764 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'765 lcod 62'764 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692672729s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 185.580017090s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692641258s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 185.580017090s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692136765s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 185.580047607s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.692108154s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 185.580047607s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691969872s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 185.579925537s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691805840s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'773 lcod 62'772 mlcod 62'772 active pruub 185.579803467s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691919327s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 185.579925537s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691760063s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'773 lcod 62'772 mlcod 0'0 unknown NOTIFY pruub 185.579803467s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691308022s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 185.579742432s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.691273689s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 185.579742432s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690748215s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'756 lcod 61'755 mlcod 61'755 active pruub 185.579467773s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690720558s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'756 lcod 61'755 mlcod 0'0 unknown NOTIFY pruub 185.579467773s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690556526s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 185.579376221s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690526962s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 185.579376221s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690239906s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'760 lcod 61'759 mlcod 61'759 active pruub 185.579360962s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64 pruub=9.690199852s) [1] r=-1 lpr=64 pi=[59,64)/1 crt=61'760 lcod 61'759 mlcod 0'0 unknown NOTIFY pruub 185.579360962s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1a( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1e( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1c( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1b( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1d( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.7( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.5( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.4( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.f( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.1( v 51'48 (0'0,51'48] local-lis/les=63/64 n=1 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.12( v 51'48 (0'0,51'48] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=51'48 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 64 pg[11.14( v 60'51 lc 50'43 (0'0,60'51] local-lis/les=63/64 n=0 ec=59/49 lis/c=59/59 les/c/f=60/60/0 sis=63) [0] r=0 lpr=63 pi=[59,63)/1 crt=60'51 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(0+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-0000000000000001:nfs.cephfs.0: -2
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:52:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:47 : epoch 697344ec : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:52:47 compute-1 ceph-mon[80126]: 9.13 deep-scrub starts
Jan 23 09:52:47 compute-1 ceph-mon[80126]: 9.13 deep-scrub ok
Jan 23 09:52:47 compute-1 ceph-mon[80126]: 8.10 deep-scrub starts
Jan 23 09:52:47 compute-1 ceph-mon[80126]: 8.10 deep-scrub ok
Jan 23 09:52:47 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.control", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:47 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "2"}]': finished
Jan 23 09:52:47 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.meta", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:52:47 compute-1 ceph-mon[80126]: osdmap e63: 3 total, 3 up, 3 in
Jan 23 09:52:47 compute-1 ceph-mon[80126]: 9.9 scrub starts
Jan 23 09:52:47 compute-1 ceph-mon[80126]: 9.9 scrub ok
Jan 23 09:52:47 compute-1 ceph-mon[80126]: pgmap v69: 353 pgs: 353 active+clean; 456 KiB data, 103 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 0 B/s wr, 49 op/s; 260 B/s, 1 objects/s recovering
Jan 23 09:52:47 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]: dispatch
Jan 23 09:52:47 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "3"}]': finished
Jan 23 09:52:47 compute-1 ceph-mon[80126]: osdmap e64: 3 total, 3 up, 3 in
Jan 23 09:52:48 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.10 scrub starts
Jan 23 09:52:48 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.10 scrub ok
Jan 23 09:52:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e65 e65: 3 total, 3 up, 3 in
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'756 lcod 61'755 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'756 lcod 61'755 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'760 lcod 61'759 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=59/60 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=61'760 lcod 61'759 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'773 lcod 62'772 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'773 lcod 62'772 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=62'759 lcod 62'758 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] r=0 lpr=65 pi=[59,65)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'765 lcod 62'764 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'769 lcod 62'768 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=64/65 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.5( v 62'766 (0'0,62'766] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'766 lcod 62'765 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=64/65 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=60'756 lcod 60'755 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'771 lcod 62'770 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'768 lcod 62'767 mlcod 0'0 active+remapped mbc={255={(0+1)=8}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=64/65 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'761 lcod 62'760 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:48 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 65 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=64/65 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=64) [2]/[0] async=[2] r=0 lpr=64 pi=[59,64)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:49 compute-1 ceph-mon[80126]: 11.15 scrub starts
Jan 23 09:52:49 compute-1 ceph-mon[80126]: 11.15 scrub ok
Jan 23 09:52:49 compute-1 ceph-mon[80126]: 9.d scrub starts
Jan 23 09:52:49 compute-1 ceph-mon[80126]: 9.d scrub ok
Jan 23 09:52:49 compute-1 ceph-mon[80126]: 9.18 scrub starts
Jan 23 09:52:49 compute-1 ceph-mon[80126]: 9.18 scrub ok
Jan 23 09:52:49 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 09:52:49 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 09:52:49 compute-1 ceph-mon[80126]: Rados config object exists: conf-nfs.cephfs
Jan 23 09:52:49 compute-1 ceph-mon[80126]: Creating key for client.nfs.cephfs.1.0.compute-2.tykohi-rgw
Jan 23 09:52:49 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:49 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.1.0.compute-2.tykohi-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:49 compute-1 ceph-mon[80126]: Bind address in nfs.cephfs.1.0.compute-2.tykohi's ganesha conf is defaulting to empty
Jan 23 09:52:49 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:49 compute-1 ceph-mon[80126]: Deploying daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2
Jan 23 09:52:49 compute-1 ceph-mon[80126]: osdmap e65: 3 total, 3 up, 3 in
Jan 23 09:52:50 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e66 e66: 3 total, 3 up, 3 in
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873781204s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.074768066s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873719215s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.074768066s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873208046s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 193.074890137s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1( v 62'768 (0'0,62'768] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.873147011s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 193.074890137s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872499466s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 193.074386597s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872417450s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 193.074386597s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872388840s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 193.074752808s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.872339249s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 193.074752808s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.693450928s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 192.895935059s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871794701s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'765 lcod 62'764 mlcod 62'764 active pruub 193.074371338s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.693422318s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 192.895935059s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.3( v 62'765 (0'0,62'765] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871710777s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'765 lcod 62'764 mlcod 0'0 unknown NOTIFY pruub 193.074371338s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871749878s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.074874878s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871699333s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'766 lcod 65'769 mlcod 65'769 active pruub 193.074935913s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871006012s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'769 lcod 62'768 mlcod 62'768 active pruub 193.074508667s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.11( v 62'769 (0'0,62'769] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.870960236s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'769 lcod 62'768 mlcod 0'0 unknown NOTIFY pruub 193.074508667s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871294975s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.074935913s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871304512s) [2] async=[2] r=-1 lpr=66 pi=[59,66)/1 crt=60'756 lcod 60'755 mlcod 60'755 active pruub 193.074981689s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871256828s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.074935913s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=64/65 n=4 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871284485s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=60'756 lcod 60'755 mlcod 0'0 unknown NOTIFY pruub 193.074981689s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.5( v 65'770 (0'0,65'770] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871036530s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'766 lcod 65'769 mlcod 0'0 unknown NOTIFY pruub 193.074935913s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.13( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=66 pruub=14.871041298s) [2] r=-1 lpr=66 pi=[59,66)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.074874878s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:50 compute-1 ceph-mon[80126]: 8.e scrub starts
Jan 23 09:52:50 compute-1 ceph-mon[80126]: 8.e scrub ok
Jan 23 09:52:50 compute-1 ceph-mon[80126]: 9.10 scrub starts
Jan 23 09:52:50 compute-1 ceph-mon[80126]: 9.10 scrub ok
Jan 23 09:52:50 compute-1 ceph-mon[80126]: pgmap v72: 353 pgs: 1 active+recovering+remapped, 1 active+remapped, 8 remapped+peering, 14 active+recovery_wait+remapped, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 58 KiB/s rd, 1.3 KiB/s wr, 111 op/s; 80/223 objects misplaced (35.874%); 227 B/s, 1 objects/s recovering
Jan 23 09:52:50 compute-1 ceph-mon[80126]: 9.c scrub starts
Jan 23 09:52:50 compute-1 ceph-mon[80126]: 9.c scrub ok
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=65/66 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=65/66 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=65/66 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'773 lcod 62'772 mlcod 0'0 active+remapped mbc={255={(0+1)=9}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=65/66 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 active+remapped mbc={255={(0+1)=6}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=65/66 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=61'756 lcod 61'755 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=65/66 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=61'760 lcod 61'759 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=65/66 n=5 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'763 lcod 62'762 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 66 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=65/66 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=65) [1]/[0] async=[1] r=0 lpr=65 pi=[59,65)/1 crt=62'759 lcod 62'758 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:52:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:52:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:52:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:51 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:51 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e67 e67: 3 total, 3 up, 3 in
Jan 23 09:52:51 compute-1 ceph-mon[80126]: osdmap e66: 3 total, 3 up, 3 in
Jan 23 09:52:51 compute-1 ceph-mon[80126]: 8.1 deep-scrub starts
Jan 23 09:52:51 compute-1 ceph-mon[80126]: 8.1 deep-scrub ok
Jan 23 09:52:51 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:51 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:51 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:51 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]: dispatch
Jan 23 09:52:51 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu", "caps": ["mon", "allow r", "osd", "allow rw pool=.nfs namespace=cephfs"]}]': finished
Jan 23 09:52:51 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]: dispatch
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107666969s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 194.456970215s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.725880623s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 193.075225830s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107567787s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 194.456970215s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.108106613s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 62'758 active pruub 194.457885742s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107128143s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 194.457000732s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.2( v 62'759 (0'0,62'759] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.108001709s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 194.457885742s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.107074738s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 194.457000732s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.15( v 62'759 (0'0,62'759] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.725820541s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'759 lcod 62'758 mlcod 0'0 unknown NOTIFY pruub 193.075225830s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=65/66 n=7 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106925964s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'773 lcod 62'772 mlcod 62'772 active pruub 194.457015991s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=65/66 n=7 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106858253s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'773 lcod 62'772 mlcod 0'0 unknown NOTIFY pruub 194.457015991s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724931717s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'768 lcod 62'767 mlcod 62'767 active pruub 193.075210571s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724854469s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'768 lcod 62'767 mlcod 0'0 unknown NOTIFY pruub 193.075210571s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724407196s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 193.075180054s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=64/65 n=7 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.724274635s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 193.075180054s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106864929s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 194.457885742s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106761932s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=61'756 lcod 61'755 mlcod 61'755 active pruub 194.458007812s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723976135s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 193.075302124s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106699944s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=61'756 lcod 61'755 mlcod 0'0 unknown NOTIFY pruub 194.458007812s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=65/66 n=5 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106610298s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 62'762 active pruub 194.458038330s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=65/66 n=6 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106502533s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 194.457885742s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=64/65 n=6 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723916054s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 193.075302124s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=65/66 n=5 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106540680s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=62'763 lcod 62'762 mlcod 0'0 unknown NOTIFY pruub 194.458038330s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723504066s) [2] async=[2] r=-1 lpr=67 pi=[59,67)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 193.075271606s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106259346s) [1] async=[1] r=-1 lpr=67 pi=[59,67)/1 crt=61'760 lcod 61'759 mlcod 61'759 active pruub 194.458068848s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.12( v 61'760 (0'0,61'760] local-lis/les=65/66 n=4 ec=59/46 lis/c=65/59 les/c/f=66/60/0 sis=67 pruub=15.106206894s) [1] r=-1 lpr=67 pi=[59,67)/1 crt=61'760 lcod 61'759 mlcod 0'0 unknown NOTIFY pruub 194.458068848s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:51 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 67 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=64/65 n=5 ec=59/46 lis/c=64/59 les/c/f=65/60/0 sis=67 pruub=13.723349571s) [2] r=-1 lpr=67 pi=[59,67)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 193.075271606s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:52:52 compute-1 ceph-mon[80126]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu
Jan 23 09:52:52 compute-1 ceph-mon[80126]: Ensuring nfs.cephfs.2 is in the ganesha grace table
Jan 23 09:52:52 compute-1 ceph-mon[80126]: pgmap v74: 353 pgs: 11 peering, 8 remapped+peering, 5 active+recovery_wait+remapped, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 1.6 KiB/s wr, 69 op/s; 29/223 objects misplaced (13.004%); 240 B/s, 13 objects/s recovering
Jan 23 09:52:52 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.mgr.nfs.grace.nfs.cephfs", "caps": ["mon", "allow r", "osd", "allow rwx pool .nfs"]}]': finished
Jan 23 09:52:52 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:52 compute-1 ceph-mon[80126]: osdmap e67: 3 total, 3 up, 3 in
Jan 23 09:52:52 compute-1 ceph-mon[80126]: 9.0 scrub starts
Jan 23 09:52:52 compute-1 ceph-mon[80126]: 9.0 scrub ok
Jan 23 09:52:52 compute-1 ceph-mon[80126]: 10.17 scrub starts
Jan 23 09:52:52 compute-1 ceph-mon[80126]: 10.17 scrub ok
Jan 23 09:52:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e68 e68: 3 total, 3 up, 3 in
Jan 23 09:52:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:53 compute-1 ceph-mon[80126]: 9.1 deep-scrub starts
Jan 23 09:52:53 compute-1 ceph-mon[80126]: 9.1 deep-scrub ok
Jan 23 09:52:53 compute-1 ceph-mon[80126]: osdmap e68: 3 total, 3 up, 3 in
Jan 23 09:52:53 compute-1 ceph-mon[80126]: 10.7 scrub starts
Jan 23 09:52:53 compute-1 ceph-mon[80126]: 10.7 scrub ok
Jan 23 09:52:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:54 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:52:54 compute-1 ceph-mon[80126]: pgmap v77: 353 pgs: 11 peering, 8 remapped+peering, 5 active+recovery_wait+remapped, 329 active+clean; 456 KiB data, 121 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 1.6 KiB/s wr, 68 op/s; 29/223 objects misplaced (13.004%); 239 B/s, 12 objects/s recovering
Jan 23 09:52:54 compute-1 ceph-mon[80126]: 8.0 scrub starts
Jan 23 09:52:54 compute-1 ceph-mon[80126]: 8.0 scrub ok
Jan 23 09:52:54 compute-1 ceph-mon[80126]: 10.5 scrub starts
Jan 23 09:52:54 compute-1 ceph-mon[80126]: 10.5 scrub ok
Jan 23 09:52:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]: dispatch
Jan 23 09:52:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth rm", "entity": "client.mgr.nfs.grace.nfs.cephfs"}]': finished
Jan 23 09:52:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]: dispatch
Jan 23 09:52:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "auth get-or-create", "entity": "client.nfs.cephfs.2.0.compute-0.fenqiu-rgw", "caps": ["mon", "allow r", "osd", "allow rwx tag rgw *=*"]}]': finished
Jan 23 09:52:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:52:54 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.17 deep-scrub starts
Jan 23 09:52:54 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.17 deep-scrub ok
Jan 23 09:52:55 compute-1 ceph-mon[80126]: 8.7 scrub starts
Jan 23 09:52:55 compute-1 ceph-mon[80126]: 8.7 scrub ok
Jan 23 09:52:55 compute-1 ceph-mon[80126]: Rados config object exists: conf-nfs.cephfs
Jan 23 09:52:55 compute-1 ceph-mon[80126]: Creating key for client.nfs.cephfs.2.0.compute-0.fenqiu-rgw
Jan 23 09:52:55 compute-1 ceph-mon[80126]: Bind address in nfs.cephfs.2.0.compute-0.fenqiu's ganesha conf is defaulting to empty
Jan 23 09:52:55 compute-1 ceph-mon[80126]: Deploying daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0
Jan 23 09:52:55 compute-1 ceph-mon[80126]: 9.3 scrub starts
Jan 23 09:52:55 compute-1 ceph-mon[80126]: 9.3 scrub ok
Jan 23 09:52:55 compute-1 ceph-mon[80126]: 8.17 deep-scrub starts
Jan 23 09:52:55 compute-1 ceph-mon[80126]: 8.17 deep-scrub ok
Jan 23 09:52:55 compute-1 ceph-mon[80126]: pgmap v78: 353 pgs: 11 peering, 5 active+recovery_wait+remapped, 337 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 1.3 KiB/s wr, 3 op/s; 29/222 objects misplaced (13.063%); 318 B/s, 15 objects/s recovering
Jan 23 09:52:55 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.e scrub starts
Jan 23 09:52:55 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.e scrub ok
Jan 23 09:52:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:52:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:52:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:52:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:52:56 : epoch 697344ec : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:52:56 compute-1 sudo[84967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:52:56 compute-1 sudo[84967]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:56 compute-1 sudo[84967]: pam_unix(sudo:session): session closed for user root
Jan 23 09:52:56 compute-1 sudo[84992]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/haproxy:2.3 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:52:56 compute-1 sudo[84992]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:52:56 compute-1 ceph-mon[80126]: 9.4 deep-scrub starts
Jan 23 09:52:56 compute-1 ceph-mon[80126]: 9.4 deep-scrub ok
Jan 23 09:52:56 compute-1 ceph-mon[80126]: 8.1f scrub starts
Jan 23 09:52:56 compute-1 ceph-mon[80126]: 8.1f scrub ok
Jan 23 09:52:56 compute-1 ceph-mon[80126]: 9.e scrub starts
Jan 23 09:52:56 compute-1 ceph-mon[80126]: 9.e scrub ok
Jan 23 09:52:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:52:56 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.15 scrub starts
Jan 23 09:52:56 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.15 scrub ok
Jan 23 09:52:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e68 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:52:57 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.1b scrub starts
Jan 23 09:52:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e69 e69: 3 total, 3 up, 3 in
Jan 23 09:52:57 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.1b scrub ok
Jan 23 09:52:57 compute-1 ceph-mon[80126]: 9.1a scrub starts
Jan 23 09:52:57 compute-1 ceph-mon[80126]: 9.1a scrub ok
Jan 23 09:52:57 compute-1 ceph-mon[80126]: 8.5 deep-scrub starts
Jan 23 09:52:57 compute-1 ceph-mon[80126]: 8.5 deep-scrub ok
Jan 23 09:52:57 compute-1 ceph-mon[80126]: Deploying daemon haproxy.nfs.cephfs.compute-1.mnxlgm on compute-1
Jan 23 09:52:57 compute-1 ceph-mon[80126]: 9.15 scrub starts
Jan 23 09:52:57 compute-1 ceph-mon[80126]: 9.15 scrub ok
Jan 23 09:52:57 compute-1 ceph-mon[80126]: pgmap v79: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 296 B/s, 15 objects/s recovering
Jan 23 09:52:57 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]: dispatch
Jan 23 09:52:58 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.4 scrub starts
Jan 23 09:52:58 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.4 scrub ok
Jan 23 09:52:58 compute-1 ceph-mon[80126]: 9.1b scrub starts
Jan 23 09:52:58 compute-1 ceph-mon[80126]: 9.1b scrub ok
Jan 23 09:52:58 compute-1 ceph-mon[80126]: 8.2 deep-scrub starts
Jan 23 09:52:58 compute-1 ceph-mon[80126]: 8.2 deep-scrub ok
Jan 23 09:52:58 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "4"}]': finished
Jan 23 09:52:58 compute-1 ceph-mon[80126]: osdmap e69: 3 total, 3 up, 3 in
Jan 23 09:52:58 compute-1 ceph-mon[80126]: 8.1b scrub starts
Jan 23 09:52:58 compute-1 ceph-mon[80126]: 8.1b scrub ok
Jan 23 09:52:59 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.18 scrub starts
Jan 23 09:52:59 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.18 scrub ok
Jan 23 09:52:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e70 e70: 3 total, 3 up, 3 in
Jan 23 09:52:59 compute-1 ceph-mon[80126]: 8.1a scrub starts
Jan 23 09:52:59 compute-1 ceph-mon[80126]: 8.1a scrub ok
Jan 23 09:52:59 compute-1 ceph-mon[80126]: 8.6 scrub starts
Jan 23 09:52:59 compute-1 ceph-mon[80126]: 8.6 scrub ok
Jan 23 09:52:59 compute-1 ceph-mon[80126]: 8.4 scrub starts
Jan 23 09:52:59 compute-1 ceph-mon[80126]: 8.4 scrub ok
Jan 23 09:52:59 compute-1 ceph-mon[80126]: pgmap v81: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 92 B/s, 6 objects/s recovering
Jan 23 09:52:59 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]: dispatch
Jan 23 09:52:59 compute-1 podman[85054]: 2026-01-23 09:52:59.950148082 +0000 UTC m=+2.980830786 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 09:53:00 compute-1 podman[85054]: 2026-01-23 09:53:00.076650693 +0000 UTC m=+3.107333417 container create 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 09:53:00 compute-1 systemd[1]: Started libpod-conmon-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope.
Jan 23 09:53:00 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.141506195s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 201.580200195s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.141422272s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 201.580200195s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140943527s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 201.579940796s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140881538s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 201.579940796s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140601158s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 201.579818726s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140559196s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 201.579818726s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140618324s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 201.580184937s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:00 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 70 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=70 pruub=13.140561104s) [2] r=-1 lpr=70 pi=[59,70)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 201.580184937s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:00 compute-1 podman[85054]: 2026-01-23 09:53:00.363974641 +0000 UTC m=+3.394657345 container init 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 09:53:00 compute-1 podman[85054]: 2026-01-23 09:53:00.37139877 +0000 UTC m=+3.402081454 container start 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 09:53:00 compute-1 jolly_mendel[85171]: 0 0
Jan 23 09:53:00 compute-1 systemd[1]: libpod-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope: Deactivated successfully.
Jan 23 09:53:00 compute-1 conmon[85171]: conmon 751768b7deabe0750f34 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope/container/memory.events
Jan 23 09:53:00 compute-1 sshd-session[85168]: Invalid user sol from 45.148.10.240 port 55330
Jan 23 09:53:00 compute-1 podman[85054]: 2026-01-23 09:53:00.430442946 +0000 UTC m=+3.461125640 container attach 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 09:53:00 compute-1 podman[85054]: 2026-01-23 09:53:00.430856867 +0000 UTC m=+3.461539551 container died 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 09:53:00 compute-1 sshd-session[85168]: Connection closed by invalid user sol 45.148.10.240 port 55330 [preauth]
Jan 23 09:53:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-7ea2d98c86d410c1127884d606a7dd74545b4518f1c81071a01d562c8a92fdce-merged.mount: Deactivated successfully.
Jan 23 09:53:00 compute-1 podman[85054]: 2026-01-23 09:53:00.58857233 +0000 UTC m=+3.619255024 container remove 751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb (image=quay.io/ceph/haproxy:2.3, name=jolly_mendel)
Jan 23 09:53:00 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.12 scrub starts
Jan 23 09:53:00 compute-1 systemd[1]: libpod-conmon-751768b7deabe0750f34a576851ba564ec0bd8c254932952fdf62c83584644fb.scope: Deactivated successfully.
Jan 23 09:53:00 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.12 scrub ok
Jan 23 09:53:00 compute-1 systemd[1]: Reloading.
Jan 23 09:53:01 compute-1 systemd-rc-local-generator[85219]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:01 compute-1 systemd-sysv-generator[85223]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 9.19 scrub starts
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 9.19 scrub ok
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 8.11 scrub starts
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 8.11 scrub ok
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 8.18 scrub starts
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 8.18 scrub ok
Jan 23 09:53:01 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "5"}]': finished
Jan 23 09:53:01 compute-1 ceph-mon[80126]: osdmap e70: 3 total, 3 up, 3 in
Jan 23 09:53:01 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 8.12 scrub starts
Jan 23 09:53:01 compute-1 ceph-mon[80126]: 8.12 scrub ok
Jan 23 09:53:01 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e71 e71: 3 total, 3 up, 3 in
Jan 23 09:53:01 compute-1 systemd[1]: Reloading.
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=59/60 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'771 lcod 62'770 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:01 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 71 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:01 compute-1 systemd-rc-local-generator[85261]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:01 compute-1 systemd-sysv-generator[85265]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:01 compute-1 systemd[1]: Starting Ceph haproxy.nfs.cephfs.compute-1.mnxlgm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:53:01 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.12 scrub starts
Jan 23 09:53:01 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 9.12 scrub ok
Jan 23 09:53:01 compute-1 podman[85318]: 2026-01-23 09:53:01.803468227 +0000 UTC m=+0.068990722 container create e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 09:53:01 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33f456f7f8080aaf0dc818d727e8500103e4388c03753b454f47838de1ecfb4a/merged/var/lib/haproxy supports timestamps until 2038 (0x7fffffff)
Jan 23 09:53:01 compute-1 podman[85318]: 2026-01-23 09:53:01.75674544 +0000 UTC m=+0.022267955 image pull e85424b0d443f37ddd2dd8a3bb2ef6f18dd352b987723a921b64289023af2914 quay.io/ceph/haproxy:2.3
Jan 23 09:53:02 compute-1 podman[85318]: 2026-01-23 09:53:02.068273654 +0000 UTC m=+0.333796159 container init e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 09:53:02 compute-1 podman[85318]: 2026-01-23 09:53:02.074144472 +0000 UTC m=+0.339666967 container start e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 09:53:02 compute-1 bash[85318]: e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f
Jan 23 09:53:02 compute-1 systemd[1]: Started Ceph haproxy.nfs.cephfs.compute-1.mnxlgm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:53:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [NOTICE] 022/095302 (2) : New worker #1 (4) forked
Jan 23 09:53:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:02 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e4000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:02 compute-1 sudo[84992]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 9.1e scrub starts
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 9.1e scrub ok
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 8.b scrub starts
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 8.b scrub ok
Jan 23 09:53:02 compute-1 ceph-mon[80126]: pgmap v83: 353 pgs: 4 unknown, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 90 B/s, 6 objects/s recovering
Jan 23 09:53:02 compute-1 ceph-mon[80126]: osdmap e71: 3 total, 3 up, 3 in
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 9.1f scrub starts
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 9.1f scrub ok
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 9.12 scrub starts
Jan 23 09:53:02 compute-1 ceph-mon[80126]: 9.12 scrub ok
Jan 23 09:53:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e71 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:02 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.19 scrub starts
Jan 23 09:53:02 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.19 scrub ok
Jan 23 09:53:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e72 e72: 3 total, 3 up, 3 in
Jan 23 09:53:02 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:02 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=71/72 n=7 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'771 lcod 62'770 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:02 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.4( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=10}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:02 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 72 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=71) [2]/[0] async=[2] r=0 lpr=71 pi=[59,71)/1 crt=62'764 lcod 62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 8.c scrub starts
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 8.c scrub ok
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 8.1e scrub starts
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 8.1e scrub ok
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 9.5 deep-scrub starts
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 9.5 deep-scrub ok
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 8.19 scrub starts
Jan 23 09:53:03 compute-1 ceph-mon[80126]: 8.19 scrub ok
Jan 23 09:53:03 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:03 compute-1 ceph-mon[80126]: osdmap e72: 3 total, 3 up, 3 in
Jan 23 09:53:03 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:03 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e73 e73: 3 total, 3 up, 3 in
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.322343826s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 207.051757812s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.322172165s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 207.051757812s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=71/72 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.321352005s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'771 lcod 62'770 mlcod 62'770 active pruub 207.051666260s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.14( v 62'771 (0'0,62'771] local-lis/les=71/72 n=7 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.321168900s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'771 lcod 62'770 mlcod 0'0 unknown NOTIFY pruub 207.051666260s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.320775986s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 72'767 mlcod 72'767 active pruub 207.051757812s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.315814018s) [2] async=[2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 62'763 active pruub 207.047042847s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.315752983s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 62'763 mlcod 0'0 unknown NOTIFY pruub 207.047042847s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:03 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 73 pg[10.4( v 72'768 (0'0,72'768] local-lis/les=71/72 n=6 ec=59/46 lis/c=71/59 les/c/f=72/60/0 sis=73 pruub=15.319846153s) [2] r=-1 lpr=73 pi=[59,73)/1 crt=62'764 lcod 72'767 mlcod 0'0 unknown NOTIFY pruub 207.051757812s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:03 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.8 scrub starts
Jan 23 09:53:03 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 8.8 scrub ok
Jan 23 09:53:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:04 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8000fb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:04 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.15 deep-scrub starts
Jan 23 09:53:04 compute-1 ceph-mon[80126]: Deploying daemon haproxy.nfs.cephfs.compute-0.yeogal on compute-0
Jan 23 09:53:04 compute-1 ceph-mon[80126]: pgmap v86: 353 pgs: 4 unknown, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:04 compute-1 ceph-mon[80126]: 9.1c scrub starts
Jan 23 09:53:04 compute-1 ceph-mon[80126]: 9.1c scrub ok
Jan 23 09:53:04 compute-1 ceph-mon[80126]: 9.1d scrub starts
Jan 23 09:53:04 compute-1 ceph-mon[80126]: 9.1d scrub ok
Jan 23 09:53:04 compute-1 ceph-mon[80126]: osdmap e73: 3 total, 3 up, 3 in
Jan 23 09:53:04 compute-1 ceph-mon[80126]: 8.8 scrub starts
Jan 23 09:53:04 compute-1 ceph-mon[80126]: 8.8 scrub ok
Jan 23 09:53:04 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.15 deep-scrub ok
Jan 23 09:53:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e74 e74: 3 total, 3 up, 3 in
Jan 23 09:53:05 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.f scrub starts
Jan 23 09:53:05 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.f scrub ok
Jan 23 09:53:05 compute-1 ceph-mon[80126]: 8.1d scrub starts
Jan 23 09:53:05 compute-1 ceph-mon[80126]: 8.1d scrub ok
Jan 23 09:53:05 compute-1 ceph-mon[80126]: 8.1c scrub starts
Jan 23 09:53:05 compute-1 ceph-mon[80126]: 8.1c scrub ok
Jan 23 09:53:05 compute-1 ceph-mon[80126]: 12.15 deep-scrub starts
Jan 23 09:53:05 compute-1 ceph-mon[80126]: 12.15 deep-scrub ok
Jan 23 09:53:05 compute-1 ceph-mon[80126]: osdmap e74: 3 total, 3 up, 3 in
Jan 23 09:53:05 compute-1 ceph-mon[80126]: pgmap v89: 353 pgs: 4 unknown, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:05 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:06 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:06 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.d scrub starts
Jan 23 09:53:06 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.d scrub ok
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 8.13 scrub starts
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 8.13 scrub ok
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 9.b deep-scrub starts
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 9.b deep-scrub ok
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 12.f scrub starts
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 12.f scrub ok
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 12.d scrub starts
Jan 23 09:53:06 compute-1 ceph-mon[80126]: 12.d scrub ok
Jan 23 09:53:07 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.5 scrub starts
Jan 23 09:53:07 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.5 scrub ok
Jan 23 09:53:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e74 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:08 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:08 compute-1 ceph-mon[80126]: 11.0 scrub starts
Jan 23 09:53:08 compute-1 ceph-mon[80126]: 11.0 scrub ok
Jan 23 09:53:08 compute-1 ceph-mon[80126]: 8.f scrub starts
Jan 23 09:53:08 compute-1 ceph-mon[80126]: 8.f scrub ok
Jan 23 09:53:08 compute-1 ceph-mon[80126]: pgmap v90: 353 pgs: 353 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 531 B/s wr, 53 op/s; 80 B/s, 4 objects/s recovering
Jan 23 09:53:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]: dispatch
Jan 23 09:53:08 compute-1 ceph-mon[80126]: 12.5 scrub starts
Jan 23 09:53:08 compute-1 ceph-mon[80126]: 12.5 scrub ok
Jan 23 09:53:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e75 e75: 3 total, 3 up, 3 in
Jan 23 09:53:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e76 e76: 3 total, 3 up, 3 in
Jan 23 09:53:08 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.0 scrub starts
Jan 23 09:53:08 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.0 scrub ok
Jan 23 09:53:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:08 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e4001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 11.c scrub starts
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 11.c scrub ok
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 8.9 scrub starts
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 8.9 scrub ok
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 11.b scrub starts
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 11.b scrub ok
Jan 23 09:53:09 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "6"}]': finished
Jan 23 09:53:09 compute-1 ceph-mon[80126]: osdmap e75: 3 total, 3 up, 3 in
Jan 23 09:53:09 compute-1 ceph-mon[80126]: osdmap e76: 3 total, 3 up, 3 in
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 12.0 scrub starts
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 12.0 scrub ok
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 9.8 scrub starts
Jan 23 09:53:09 compute-1 ceph-mon[80126]: 9.8 scrub ok
Jan 23 09:53:09 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:09 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:09 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:09 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]: dispatch
Jan 23 09:53:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e77 e77: 3 total, 3 up, 3 in
Jan 23 09:53:09 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:09 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.6( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:09 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:09 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 77 pg[10.16( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=77) [0] r=0 lpr=77 pi=[67,77)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:09 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1f scrub starts
Jan 23 09:53:09 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1f scrub ok
Jan 23 09:53:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:10 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8001c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:10 compute-1 ceph-mon[80126]: Deploying daemon haproxy.nfs.cephfs.compute-2.bbaqsj on compute-2
Jan 23 09:53:10 compute-1 ceph-mon[80126]: pgmap v93: 353 pgs: 353 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 556 B/s wr, 56 op/s; 84 B/s, 4 objects/s recovering
Jan 23 09:53:10 compute-1 ceph-mon[80126]: 11.9 scrub starts
Jan 23 09:53:10 compute-1 ceph-mon[80126]: 11.9 scrub ok
Jan 23 09:53:10 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "7"}]': finished
Jan 23 09:53:10 compute-1 ceph-mon[80126]: osdmap e77: 3 total, 3 up, 3 in
Jan 23 09:53:10 compute-1 ceph-mon[80126]: 12.1f scrub starts
Jan 23 09:53:10 compute-1 ceph-mon[80126]: 12.1f scrub ok
Jan 23 09:53:10 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1b deep-scrub starts
Jan 23 09:53:10 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1b deep-scrub ok
Jan 23 09:53:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e78 e78: 3 total, 3 up, 3 in
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.16( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.6( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.1e( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.6( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:10 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 78 pg[10.16( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=78) [0]/[1] r=-1 lpr=78 pi=[67,78)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:10 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:11 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.16 scrub starts
Jan 23 09:53:11 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.16 scrub ok
Jan 23 09:53:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e79 e79: 3 total, 3 up, 3 in
Jan 23 09:53:11 compute-1 ceph-mon[80126]: 11.d scrub starts
Jan 23 09:53:11 compute-1 ceph-mon[80126]: 11.d scrub ok
Jan 23 09:53:11 compute-1 ceph-mon[80126]: 12.1b deep-scrub starts
Jan 23 09:53:11 compute-1 ceph-mon[80126]: 12.1b deep-scrub ok
Jan 23 09:53:11 compute-1 ceph-mon[80126]: osdmap e78: 3 total, 3 up, 3 in
Jan 23 09:53:11 compute-1 ceph-mon[80126]: pgmap v96: 353 pgs: 4 active+remapped, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1 op/s; 148 B/s, 6 objects/s recovering
Jan 23 09:53:11 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]: dispatch
Jan 23 09:53:11 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79) [0] r=0 lpr=79 pi=[66,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:11 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=79) [0] r=0 lpr=79 pi=[67,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:11 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79) [0] r=0 lpr=79 pi=[66,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:11 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 79 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=79) [0] r=0 lpr=79 pi=[66,79)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:12 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:12 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.14 scrub starts
Jan 23 09:53:12 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.14 scrub ok
Jan 23 09:53:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e79 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:12 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e4001bd0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:12 compute-1 ceph-mon[80126]: 11.2 scrub starts
Jan 23 09:53:12 compute-1 ceph-mon[80126]: 11.2 scrub ok
Jan 23 09:53:12 compute-1 ceph-mon[80126]: 12.16 scrub starts
Jan 23 09:53:12 compute-1 ceph-mon[80126]: 12.16 scrub ok
Jan 23 09:53:12 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "8"}]': finished
Jan 23 09:53:12 compute-1 ceph-mon[80126]: osdmap e79: 3 total, 3 up, 3 in
Jan 23 09:53:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e80 e80: 3 total, 3 up, 3 in
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.17( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=62'763 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=0/0 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 luod=0'0 crt=62'763 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=0/0 n=5 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[67,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[67,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:12 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 80 pg[10.7( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=80) [0]/[2] r=-1 lpr=80 pi=[66,80)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:13 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1 deep-scrub starts
Jan 23 09:53:13 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 12.1 deep-scrub ok
Jan 23 09:53:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:14 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8001c90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:14 compute-1 ceph-mon[80126]: 11.6 scrub starts
Jan 23 09:53:14 compute-1 ceph-mon[80126]: 11.6 scrub ok
Jan 23 09:53:14 compute-1 ceph-mon[80126]: 10.1c scrub starts
Jan 23 09:53:14 compute-1 ceph-mon[80126]: 10.1c scrub ok
Jan 23 09:53:14 compute-1 ceph-mon[80126]: 12.14 scrub starts
Jan 23 09:53:14 compute-1 ceph-mon[80126]: 12.14 scrub ok
Jan 23 09:53:14 compute-1 ceph-mon[80126]: osdmap e80: 3 total, 3 up, 3 in
Jan 23 09:53:14 compute-1 ceph-mon[80126]: pgmap v99: 353 pgs: 4 active+remapped, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 0 op/s; 148 B/s, 6 objects/s recovering
Jan 23 09:53:14 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]: dispatch
Jan 23 09:53:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e81 e81: 3 total, 3 up, 3 in
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.164137840s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 217.580154419s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.163958549s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 217.580169678s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.163928986s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 217.580169678s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=81 pruub=15.163748741s) [1] r=-1 lpr=81 pi=[59,81)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 217.580154419s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.6( v 62'763 (0'0,62'763] local-lis/les=80/81 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'763 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.e( v 62'764 (0'0,62'764] local-lis/les=80/81 n=6 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 81 pg[10.16( v 58'754 (0'0,58'754] local-lis/les=80/81 n=4 ec=59/46 lis/c=78/67 les/c/f=79/68/0 sis=80) [0] r=0 lpr=80 pi=[67,80)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:14 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:14 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1a scrub starts
Jan 23 09:53:14 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1a scrub ok
Jan 23 09:53:14 compute-1 sudo[85350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:53:14 compute-1 sudo[85350]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:14 compute-1 sudo[85350]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:14 compute-1 sudo[85375]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/keepalived:2.2.4 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:53:14 compute-1 sudo[85375]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:53:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:14 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 11.18 scrub starts
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 11.18 scrub ok
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 10.1b scrub starts
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 10.1b scrub ok
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 12.1 deep-scrub starts
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 12.1 deep-scrub ok
Jan 23 09:53:15 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "9"}]': finished
Jan 23 09:53:15 compute-1 ceph-mon[80126]: osdmap e81: 3 total, 3 up, 3 in
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 11.1f scrub starts
Jan 23 09:53:15 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 11.1f scrub ok
Jan 23 09:53:15 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 10.19 scrub starts
Jan 23 09:53:15 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 10.19 scrub ok
Jan 23 09:53:15 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 11.1a scrub starts
Jan 23 09:53:15 compute-1 ceph-mon[80126]: 11.1a scrub ok
Jan 23 09:53:15 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e82 e82: 3 total, 3 up, 3 in
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=0/0 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=59/60 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=62'761 lcod 62'760 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=59/60 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] r=0 lpr=82 pi=[59,82)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82) [0] r=0 lpr=82 pi=[67,82)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82) [0] r=0 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:15 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 82 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:15 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1e scrub starts
Jan 23 09:53:15 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1e scrub ok
Jan 23 09:53:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:16 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:16 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e83 e83: 3 total, 3 up, 3 in
Jan 23 09:53:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.7( v 62'759 (0'0,62'759] local-lis/les=82/83 n=5 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=80/67 les/c/f=81/68/0 sis=82) [0] r=0 lpr=82 pi=[67,82)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.17( v 62'759 (0'0,62'759] local-lis/les=82/83 n=3 ec=59/46 lis/c=80/66 les/c/f=81/67/0 sis=82) [0] r=0 lpr=82 pi=[66,82)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:53:16 compute-1 ceph-mon[80126]: Deploying daemon keepalived.nfs.cephfs.compute-1.vcrquf on compute-1
Jan 23 09:53:16 compute-1 ceph-mon[80126]: pgmap v101: 353 pgs: 4 peering, 349 active+clean; 455 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 3.2 KiB/s rd, 3 op/s; 50 B/s, 4 objects/s recovering
Jan 23 09:53:16 compute-1 ceph-mon[80126]: osdmap e82: 3 total, 3 up, 3 in
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 11.10 scrub starts
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 11.10 scrub ok
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 8.d scrub starts
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 8.d scrub ok
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 11.1e scrub starts
Jan 23 09:53:16 compute-1 ceph-mon[80126]: 11.1e scrub ok
Jan 23 09:53:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=82/83 n=4 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[59,82)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=4}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:16 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 83 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=82/83 n=6 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=82) [1]/[0] async=[1] r=0 lpr=82 pi=[59,82)/1 crt=62'761 lcod 62'760 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:16 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:16 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1c scrub starts
Jan 23 09:53:16 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1c scrub ok
Jan 23 09:53:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:16 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c80016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:17 compute-1 ceph-mon[80126]: osdmap e83: 3 total, 3 up, 3 in
Jan 23 09:53:17 compute-1 ceph-mon[80126]: 8.a scrub starts
Jan 23 09:53:17 compute-1 ceph-mon[80126]: 8.a scrub ok
Jan 23 09:53:17 compute-1 ceph-mon[80126]: 11.11 scrub starts
Jan 23 09:53:17 compute-1 ceph-mon[80126]: 11.11 scrub ok
Jan 23 09:53:17 compute-1 ceph-mon[80126]: 11.1c scrub starts
Jan 23 09:53:17 compute-1 ceph-mon[80126]: 11.1c scrub ok
Jan 23 09:53:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e84 e84: 3 total, 3 up, 3 in
Jan 23 09:53:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=82/83 n=6 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.970046043s) [1] async=[1] r=-1 lpr=84 pi=[59,84)/1 crt=62'761 lcod 62'760 mlcod 62'760 active pruub 220.446502686s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=82/83 n=4 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.969717979s) [1] async=[1] r=-1 lpr=84 pi=[59,84)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 220.446441650s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.18( v 58'754 (0'0,58'754] local-lis/les=82/83 n=4 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.969678879s) [1] r=-1 lpr=84 pi=[59,84)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 220.446441650s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:17 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 84 pg[10.8( v 62'761 (0'0,62'761] local-lis/les=82/83 n=6 ec=59/46 lis/c=82/59 les/c/f=83/60/0 sis=84 pruub=14.969963074s) [1] r=-1 lpr=84 pi=[59,84)/1 crt=62'761 lcod 62'760 mlcod 0'0 unknown NOTIFY pruub 220.446502686s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:17 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1b scrub starts
Jan 23 09:53:17 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1b scrub ok
Jan 23 09:53:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:17 compute-1 podman[85440]: 2026-01-23 09:53:17.933959038 +0000 UTC m=+2.899000893 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 09:53:17 compute-1 podman[85440]: 2026-01-23 09:53:17.970117897 +0000 UTC m=+2.935159712 container create 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2, version=2.2.4, release=1793, name=keepalived, io.openshift.tags=Ceph keepalived, description=keepalived for Ceph, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph.)
Jan 23 09:53:18 compute-1 systemd[1]: Started libpod-conmon-74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611.scope.
Jan 23 09:53:18 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:53:18 compute-1 podman[85440]: 2026-01-23 09:53:18.071850618 +0000 UTC m=+3.036892513 container init 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, description=keepalived for Ceph, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.28.2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, name=keepalived, com.redhat.component=keepalived-container)
Jan 23 09:53:18 compute-1 podman[85440]: 2026-01-23 09:53:18.084278246 +0000 UTC m=+3.049320091 container start 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, io.buildah.version=1.28.2, distribution-scope=public, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, version=2.2.4, name=keepalived, description=keepalived for Ceph, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, build-date=2023-02-22T09:23:20)
Jan 23 09:53:18 compute-1 podman[85440]: 2026-01-23 09:53:18.088482791 +0000 UTC m=+3.053524696 container attach 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, io.buildah.version=1.28.2, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., release=1793, io.openshift.expose-services=, vcs-type=git, com.redhat.component=keepalived-container, description=keepalived for Ceph, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, version=2.2.4, io.k8s.display-name=Keepalived on RHEL 9, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 23 09:53:18 compute-1 great_franklin[85533]: 0 0
Jan 23 09:53:18 compute-1 systemd[1]: libpod-74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611.scope: Deactivated successfully.
Jan 23 09:53:18 compute-1 podman[85440]: 2026-01-23 09:53:18.095211057 +0000 UTC m=+3.060252922 container died 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, release=1793, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, vendor=Red Hat, Inc., io.buildah.version=1.28.2, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, vcs-type=git, com.redhat.component=keepalived-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Jan 23 09:53:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:18 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:18 compute-1 systemd[1]: var-lib-containers-storage-overlay-7b87cbaa715e1732428b6ea58cc6eec089245e9495a803abc48eaa43aa31956b-merged.mount: Deactivated successfully.
Jan 23 09:53:18 compute-1 podman[85440]: 2026-01-23 09:53:18.14369586 +0000 UTC m=+3.108737675 container remove 74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611 (image=quay.io/ceph/keepalived:2.2.4, name=great_franklin, distribution-scope=public, release=1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vendor=Red Hat, Inc., com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=keepalived, architecture=x86_64, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9)
Jan 23 09:53:18 compute-1 systemd[1]: libpod-conmon-74f0875c4244b1e09bc91344939e131466fe6b3878f2f11ea0ad7043afd91611.scope: Deactivated successfully.
Jan 23 09:53:18 compute-1 systemd[1]: Reloading.
Jan 23 09:53:18 compute-1 systemd-rc-local-generator[85577]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:18 compute-1 systemd-sysv-generator[85581]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:18 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40089d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:18 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1d deep-scrub starts
Jan 23 09:53:18 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1d deep-scrub ok
Jan 23 09:53:18 compute-1 ceph-mon[80126]: pgmap v104: 353 pgs: 4 peering, 349 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Jan 23 09:53:18 compute-1 ceph-mon[80126]: osdmap e84: 3 total, 3 up, 3 in
Jan 23 09:53:18 compute-1 ceph-mon[80126]: 9.7 scrub starts
Jan 23 09:53:18 compute-1 ceph-mon[80126]: 9.7 scrub ok
Jan 23 09:53:18 compute-1 ceph-mon[80126]: 12.10 scrub starts
Jan 23 09:53:18 compute-1 ceph-mon[80126]: 11.1b scrub starts
Jan 23 09:53:18 compute-1 ceph-mon[80126]: 12.10 scrub ok
Jan 23 09:53:18 compute-1 ceph-mon[80126]: 11.1b scrub ok
Jan 23 09:53:18 compute-1 systemd[1]: Reloading.
Jan 23 09:53:18 compute-1 systemd-sysv-generator[85625]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:53:18 compute-1 systemd-rc-local-generator[85622]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:53:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:18 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:18 compute-1 systemd[1]: Starting Ceph keepalived.nfs.cephfs.compute-1.vcrquf for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:53:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e85 e85: 3 total, 3 up, 3 in
Jan 23 09:53:19 compute-1 podman[85675]: 2026-01-23 09:53:19.193468305 +0000 UTC m=+0.075819271 container create 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, name=keepalived, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, description=keepalived for Ceph)
Jan 23 09:53:19 compute-1 podman[85675]: 2026-01-23 09:53:19.16087056 +0000 UTC m=+0.043221576 image pull 4a3a1ff181d97c6dcfa9138ad76eb99fa2c1e840298461d5a7a56133bc05b9a2 quay.io/ceph/keepalived:2.2.4
Jan 23 09:53:19 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09e464e4ed8a605a356fad7fe10c62525e64299965ede90a0c3d729d42259e69/merged/etc/keepalived/keepalived.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:53:19 compute-1 podman[85675]: 2026-01-23 09:53:19.270909147 +0000 UTC m=+0.153260163 container init 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, description=keepalived for Ceph, vendor=Red Hat, Inc., release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=keepalived-container, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.buildah.version=1.28.2, version=2.2.4)
Jan 23 09:53:19 compute-1 podman[85675]: 2026-01-23 09:53:19.280913798 +0000 UTC m=+0.163264764 container start 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, version=2.2.4, com.redhat.component=keepalived-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.display-name=Keepalived on RHEL 9, distribution-scope=public, name=keepalived, description=keepalived for Ceph, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, build-date=2023-02-22T09:23:20, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.openshift.tags=Ceph keepalived, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.28.2)
Jan 23 09:53:19 compute-1 bash[85675]: 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af
Jan 23 09:53:19 compute-1 systemd[1]: Started Ceph keepalived.nfs.cephfs.compute-1.vcrquf for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Starting Keepalived v2.2.4 (08/21,2021)
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Running on Linux 5.14.0-661.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Fri Jan 16 09:19:22 UTC 2026 (built for Linux 5.14.0)
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Command line: '/usr/sbin/keepalived' '-n' '-l' '-f' '/etc/keepalived/keepalived.conf'
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Configuration file /etc/keepalived/keepalived.conf
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: NOTICE: setting config option max_auto_priority should result in better keepalived performance
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Starting VRRP child process, pid=4
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: Startup complete
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: (VI_0) Entering BACKUP STATE (init)
Jan 23 09:53:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:19 2026: VRRP_Script(check_backend) succeeded
Jan 23 09:53:19 compute-1 sudo[85375]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:19 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.4 scrub starts
Jan 23 09:53:19 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.4 scrub ok
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 11.a scrub starts
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 11.a scrub ok
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 11.1d deep-scrub starts
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 12.6 scrub starts
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 11.1d deep-scrub ok
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 12.6 scrub ok
Jan 23 09:53:19 compute-1 ceph-mon[80126]: osdmap e85: 3 total, 3 up, 3 in
Jan 23 09:53:19 compute-1 ceph-mon[80126]: pgmap v107: 353 pgs: 353 active+clean; 456 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 109 B/s, 3 objects/s recovering
Jan 23 09:53:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]: dispatch
Jan 23 09:53:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:19 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 09:53:19 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:53:19 compute-1 ceph-mon[80126]: Deploying daemon keepalived.nfs.cephfs.compute-0.lrsdkc on compute-0
Jan 23 09:53:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e86 e86: 3 total, 3 up, 3 in
Jan 23 09:53:19 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 86 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=86) [0] r=0 lpr=86 pi=[66,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:19 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 86 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=86) [0] r=0 lpr=86 pi=[67,86)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:20 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:20 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.7 scrub starts
Jan 23 09:53:20 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.7 scrub ok
Jan 23 09:53:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:20 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:20 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 11.4 scrub starts
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 11.4 scrub ok
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 11.16 scrub starts
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 11.16 scrub ok
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 12.c deep-scrub starts
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 12.c deep-scrub ok
Jan 23 09:53:21 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "10"}]': finished
Jan 23 09:53:21 compute-1 ceph-mon[80126]: osdmap e86: 3 total, 3 up, 3 in
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 11.7 scrub starts
Jan 23 09:53:21 compute-1 ceph-mon[80126]: 11.7 scrub ok
Jan 23 09:53:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e87 e87: 3 total, 3 up, 3 in
Jan 23 09:53:21 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[66,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:21 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.9( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[66,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:21 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:21 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.19( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0]/[2] r=-1 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:21 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.f deep-scrub starts
Jan 23 09:53:21 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.f deep-scrub ok
Jan 23 09:53:21 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0] r=0 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:21 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 87 pg[10.1a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=87) [0] r=0 lpr=87 pi=[67,87)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:22 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 12.12 scrub starts
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 11.8 scrub starts
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 12.12 scrub ok
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 11.8 scrub ok
Jan 23 09:53:22 compute-1 ceph-mon[80126]: pgmap v109: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 6 objects/s recovering
Jan 23 09:53:22 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]: dispatch
Jan 23 09:53:22 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "11"}]': finished
Jan 23 09:53:22 compute-1 ceph-mon[80126]: osdmap e87: 3 total, 3 up, 3 in
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 11.f deep-scrub starts
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 11.f deep-scrub ok
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 12.b scrub starts
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 12.b scrub ok
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 11.13 scrub starts
Jan 23 09:53:22 compute-1 ceph-mon[80126]: 11.13 scrub ok
Jan 23 09:53:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e88 e88: 3 total, 3 up, 3 in
Jan 23 09:53:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.1a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:22 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 88 pg[10.1a( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=67/67 les/c/f=68/68/0 sis=88) [0]/[1] r=-1 lpr=88 pi=[67,88)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:22 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:22 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:22 2026: (VI_0) Entering MASTER STATE
Jan 23 09:53:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e89 e89: 3 total, 3 up, 3 in
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 luod=0'0 crt=58'754 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=0/0 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89) [0] r=0 lpr=89 pi=[67,89)/1 luod=0'0 crt=62'771 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=0/0 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89) [0] r=0 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 89 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:23 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.16 scrub starts
Jan 23 09:53:23 compute-1 ceph-mon[80126]: osdmap e88: 3 total, 3 up, 3 in
Jan 23 09:53:23 compute-1 ceph-mon[80126]: 11.17 scrub starts
Jan 23 09:53:23 compute-1 ceph-mon[80126]: 11.17 scrub ok
Jan 23 09:53:23 compute-1 ceph-mon[80126]: 12.e scrub starts
Jan 23 09:53:23 compute-1 ceph-mon[80126]: 12.e scrub ok
Jan 23 09:53:23 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]: dispatch
Jan 23 09:53:23 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.16 scrub ok
Jan 23 09:53:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e90 e90: 3 total, 3 up, 3 in
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 luod=0'0 crt=61'756 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=0/0 n=7 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 luod=0'0 crt=62'773 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=0/0 n=4 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=61'756 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=0/0 n=7 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=62'773 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.1b( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=66/66 les/c/f=67/67/0 sis=90) [0]/[2] r=-1 lpr=90 pi=[66,90)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.9( v 58'754 (0'0,58'754] local-lis/les=89/90 n=4 ec=59/46 lis/c=87/66 les/c/f=88/67/0 sis=89) [0] r=0 lpr=89 pi=[66,89)/1 crt=58'754 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:23 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 90 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=87/67 les/c/f=88/68/0 sis=89) [0] r=0 lpr=89 pi=[67,89)/1 crt=62'771 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:24 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:24 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.0 scrub starts
Jan 23 09:53:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:24 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:24 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.0 scrub ok
Jan 23 09:53:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:24 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:25 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.f scrub starts
Jan 23 09:53:25 compute-1 ceph-mon[80126]: pgmap v112: 353 pgs: 353 active+clean; 457 KiB data, 125 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 7 objects/s recovering
Jan 23 09:53:25 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "12"}]': finished
Jan 23 09:53:25 compute-1 ceph-mon[80126]: osdmap e89: 3 total, 3 up, 3 in
Jan 23 09:53:25 compute-1 ceph-mon[80126]: 10.16 scrub starts
Jan 23 09:53:25 compute-1 ceph-mon[80126]: 10.16 scrub ok
Jan 23 09:53:25 compute-1 ceph-mon[80126]: 12.1d scrub starts
Jan 23 09:53:25 compute-1 ceph-mon[80126]: 12.1d scrub ok
Jan 23 09:53:25 compute-1 ceph-mon[80126]: osdmap e90: 3 total, 3 up, 3 in
Jan 23 09:53:25 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.f scrub ok
Jan 23 09:53:25 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e91 e91: 3 total, 3 up, 3 in
Jan 23 09:53:25 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 91 pg[10.a( v 62'773 (0'0,62'773] local-lis/les=90/91 n=7 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=62'773 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:25 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 91 pg[10.1a( v 61'756 (0'0,61'756] local-lis/les=90/91 n=4 ec=59/46 lis/c=88/67 les/c/f=89/68/0 sis=90) [0] r=0 lpr=90 pi=[67,90)/1 crt=61'756 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:26 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:26 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.e scrub starts
Jan 23 09:53:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:26 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e40096e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:26 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.e scrub ok
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 10.0 scrub starts
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 10.0 scrub ok
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 12.1e scrub starts
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 12.1e scrub ok
Jan 23 09:53:26 compute-1 ceph-mon[80126]: pgmap v115: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 82 B/s, 4 objects/s recovering
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 10.f scrub starts
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 10.f scrub ok
Jan 23 09:53:26 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-1 ceph-mon[80126]: osdmap e91: 3 total, 3 up, 3 in
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 12.2 scrub starts
Jan 23 09:53:26 compute-1 ceph-mon[80126]: 12.2 scrub ok
Jan 23 09:53:26 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:26 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e92 e92: 3 total, 3 up, 3 in
Jan 23 09:53:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 luod=0'0 crt=62'759 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=0/0 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 luod=0'0 crt=60'756 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:27 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 92 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=0/0 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:27 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.6 deep-scrub starts
Jan 23 09:53:27 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.6 deep-scrub ok
Jan 23 09:53:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e92 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:28 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:28 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1 scrub starts
Jan 23 09:53:28 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.1 scrub ok
Jan 23 09:53:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:28 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:28 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e400a3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:29 2026: (VI_0) Master received advert from 192.168.122.100 with higher priority 100, ours 90
Jan 23 09:53:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf[85690]: Fri Jan 23 09:53:29 2026: (VI_0) Entering BACKUP STATE
Jan 23 09:53:29 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.12 deep-scrub starts
Jan 23 09:53:29 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.12 deep-scrub ok
Jan 23 09:53:29 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:53:29 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:53:29 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-1 interface br-ex
Jan 23 09:53:29 compute-1 ceph-mon[80126]: Deploying daemon keepalived.nfs.cephfs.compute-2.pawaai on compute-2
Jan 23 09:53:29 compute-1 ceph-mon[80126]: 10.e scrub starts
Jan 23 09:53:29 compute-1 ceph-mon[80126]: 10.e scrub ok
Jan 23 09:53:29 compute-1 ceph-mon[80126]: 12.3 scrub starts
Jan 23 09:53:29 compute-1 ceph-mon[80126]: 12.3 scrub ok
Jan 23 09:53:29 compute-1 ceph-mon[80126]: osdmap e92: 3 total, 3 up, 3 in
Jan 23 09:53:29 compute-1 ceph-mon[80126]: pgmap v118: 353 pgs: 1 active+recovering+remapped, 1 active+remapped, 2 peering, 349 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 2/221 objects misplaced (0.905%); 137 B/s, 5 objects/s recovering
Jan 23 09:53:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e93 e93: 3 total, 3 up, 3 in
Jan 23 09:53:29 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 93 pg[10.b( v 62'759 (0'0,62'759] local-lis/les=92/93 n=5 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=62'759 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:29 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 93 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=90/66 les/c/f=91/67/0 sis=92) [0] r=0 lpr=92 pi=[66,92)/1 crt=60'756 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:30 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e400a3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:30 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.5 scrub starts
Jan 23 09:53:30 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.5 scrub ok
Jan 23 09:53:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:30 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:30 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 10.6 deep-scrub starts
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 10.6 deep-scrub ok
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.1a scrub starts
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.1a scrub ok
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 11.1 scrub starts
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 11.1 scrub ok
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.9 scrub starts
Jan 23 09:53:30 compute-1 ceph-mon[80126]: pgmap v119: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 19 B/s, 0 objects/s recovering
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 11.12 deep-scrub starts
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 11.12 deep-scrub ok
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.9 scrub ok
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.4 scrub starts
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.4 scrub ok
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.a scrub starts
Jan 23 09:53:30 compute-1 ceph-mon[80126]: 12.a scrub ok
Jan 23 09:53:30 compute-1 ceph-mon[80126]: osdmap e93: 3 total, 3 up, 3 in
Jan 23 09:53:31 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.14 deep-scrub starts
Jan 23 09:53:31 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 11.14 deep-scrub ok
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 11.5 scrub starts
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 11.5 scrub ok
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 12.7 scrub starts
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 12.7 scrub ok
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 12.8 deep-scrub starts
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 12.8 deep-scrub ok
Jan 23 09:53:31 compute-1 ceph-mon[80126]: pgmap v121: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 11.14 deep-scrub starts
Jan 23 09:53:31 compute-1 ceph-mon[80126]: 11.14 deep-scrub ok
Jan 23 09:53:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:32 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:32 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1e scrub starts
Jan 23 09:53:32 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1e scrub ok
Jan 23 09:53:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:32 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e400a3f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:32 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:32 compute-1 ceph-mon[80126]: 11.19 deep-scrub starts
Jan 23 09:53:32 compute-1 ceph-mon[80126]: 11.19 deep-scrub ok
Jan 23 09:53:32 compute-1 ceph-mon[80126]: 12.1c scrub starts
Jan 23 09:53:32 compute-1 ceph-mon[80126]: 12.1c scrub ok
Jan 23 09:53:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-1 ceph-mon[80126]: 10.1e scrub starts
Jan 23 09:53:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-1 ceph-mon[80126]: 10.1e scrub ok
Jan 23 09:53:32 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:32 compute-1 ceph-mon[80126]: Deploying daemon alertmanager.compute-0 on compute-0
Jan 23 09:53:33 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1f scrub starts
Jan 23 09:53:33 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1f scrub ok
Jan 23 09:53:34 compute-1 ceph-mon[80126]: 12.17 scrub starts
Jan 23 09:53:34 compute-1 ceph-mon[80126]: 12.17 scrub ok
Jan 23 09:53:34 compute-1 ceph-mon[80126]: 12.19 scrub starts
Jan 23 09:53:34 compute-1 ceph-mon[80126]: 12.19 scrub ok
Jan 23 09:53:34 compute-1 ceph-mon[80126]: pgmap v122: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:34 compute-1 ceph-mon[80126]: 10.1f scrub starts
Jan 23 09:53:34 compute-1 ceph-mon[80126]: 10.1f scrub ok
Jan 23 09:53:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:34 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9bc003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:34 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.10 scrub starts
Jan 23 09:53:34 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.10 scrub ok
Jan 23 09:53:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:34 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:34 compute-1 sshd-session[85707]: Accepted publickey for zuul from 192.168.122.30 port 53858 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:53:34 compute-1 systemd-logind[807]: New session 36 of user zuul.
Jan 23 09:53:34 compute-1 systemd[1]: Started Session 36 of User zuul.
Jan 23 09:53:34 compute-1 sshd-session[85707]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:53:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:34 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:35 compute-1 ceph-mon[80126]: 12.11 scrub starts
Jan 23 09:53:35 compute-1 ceph-mon[80126]: 12.11 scrub ok
Jan 23 09:53:35 compute-1 ceph-mon[80126]: 10.15 scrub starts
Jan 23 09:53:35 compute-1 ceph-mon[80126]: 10.15 scrub ok
Jan 23 09:53:35 compute-1 ceph-mon[80126]: 10.10 scrub starts
Jan 23 09:53:35 compute-1 ceph-mon[80126]: 10.10 scrub ok
Jan 23 09:53:35 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.9 scrub starts
Jan 23 09:53:35 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.9 scrub ok
Jan 23 09:53:35 compute-1 python3.9[85861]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:53:36 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e94 e94: 3 total, 3 up, 3 in
Jan 23 09:53:36 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 94 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94) [0] r=0 lpr=94 pi=[73,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:36 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 94 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=94) [0] r=0 lpr=94 pi=[73,94)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:36 compute-1 ceph-mon[80126]: 12.13 scrub starts
Jan 23 09:53:36 compute-1 ceph-mon[80126]: 12.13 scrub ok
Jan 23 09:53:36 compute-1 ceph-mon[80126]: 10.12 scrub starts
Jan 23 09:53:36 compute-1 ceph-mon[80126]: 10.12 scrub ok
Jan 23 09:53:36 compute-1 ceph-mon[80126]: pgmap v123: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]: dispatch
Jan 23 09:53:36 compute-1 ceph-mon[80126]: 10.9 scrub starts
Jan 23 09:53:36 compute-1 ceph-mon[80126]: 10.9 scrub ok
Jan 23 09:53:36 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:36 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:36 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.a scrub starts
Jan 23 09:53:36 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.a scrub ok
Jan 23 09:53:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:36 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc000f90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:36 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:37 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.b deep-scrub starts
Jan 23 09:53:37 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.b deep-scrub ok
Jan 23 09:53:37 compute-1 sudo[86075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zstwlhgnxyqtrlyarcwgrczyglckbtix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162016.9512532-52-132754282481525/AnsiballZ_command.py'
Jan 23 09:53:37 compute-1 sudo[86075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:53:37 compute-1 python3.9[86077]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail
                                            pushd /var/tmp
                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                            pushd repo-setup-main
                                            python3 -m venv ./venv
                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./
                                            ./venv/bin/repo-setup current-podified -b antelope
                                            popd
                                            rm -rf repo-setup-main
                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:53:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:38 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:38 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1a scrub starts
Jan 23 09:53:38 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.1a scrub ok
Jan 23 09:53:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:38 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d80029a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e95 e95: 3 total, 3 up, 3 in
Jan 23 09:53:38 compute-1 ceph-mon[80126]: 12.18 scrub starts
Jan 23 09:53:38 compute-1 ceph-mon[80126]: 10.d scrub starts
Jan 23 09:53:38 compute-1 ceph-mon[80126]: 12.18 scrub ok
Jan 23 09:53:38 compute-1 ceph-mon[80126]: 10.d scrub ok
Jan 23 09:53:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "13"}]': finished
Jan 23 09:53:38 compute-1 ceph-mon[80126]: osdmap e94: 3 total, 3 up, 3 in
Jan 23 09:53:38 compute-1 ceph-mon[80126]: 10.a scrub starts
Jan 23 09:53:38 compute-1 ceph-mon[80126]: 10.a scrub ok
Jan 23 09:53:38 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:38 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:38 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:38 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:38 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 95 pg[10.1c( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=73/73 les/c/f=74/74/0 sis=95) [0]/[2] r=-1 lpr=95 pi=[73,95)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:38 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 11.3 scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 11.3 scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.8 scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.8 scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: pgmap v125: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.b deep-scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.b deep-scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 11.e scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 11.e scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.2 scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.2 scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.1a scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.1a scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.18 scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.4 deep-scrub starts
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.18 scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: 10.4 deep-scrub ok
Jan 23 09:53:39 compute-1 ceph-mon[80126]: osdmap e95: 3 total, 3 up, 3 in
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-1 ceph-mon[80126]: Regenerating cephadm self-signed grafana TLS certificates
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:39 compute-1 ceph-mon[80126]: Deploying daemon grafana.compute-0 on compute-0
Jan 23 09:53:39 compute-1 ceph-mon[80126]: pgmap v127: 353 pgs: 353 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:39 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]: dispatch
Jan 23 09:53:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e96 e96: 3 total, 3 up, 3 in
Jan 23 09:53:39 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 96 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=96) [0] r=0 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:39 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 96 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=96) [0] r=0 lpr=96 pi=[78,96)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:40 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:40 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:40 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e97 e97: 3 total, 3 up, 3 in
Jan 23 09:53:40 compute-1 ceph-mon[80126]: 10.1 scrub starts
Jan 23 09:53:40 compute-1 ceph-mon[80126]: 10.1d scrub starts
Jan 23 09:53:40 compute-1 ceph-mon[80126]: 10.1d scrub ok
Jan 23 09:53:40 compute-1 ceph-mon[80126]: 10.1 scrub ok
Jan 23 09:53:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 09:53:40 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "14"}]': finished
Jan 23 09:53:40 compute-1 ceph-mon[80126]: osdmap e96: 3 total, 3 up, 3 in
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.d( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=78/78 les/c/f=79/79/0 sis=97) [0]/[1] r=-1 lpr=97 pi=[78,97)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 97 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=0/0 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e98 e98: 3 total, 3 up, 3 in
Jan 23 09:53:42 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 98 pg[10.c( v 62'764 (0'0,62'764] local-lis/les=97/98 n=6 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:42 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 98 pg[10.1c( v 62'764 (0'0,62'764] local-lis/les=97/98 n=7 ec=59/46 lis/c=95/73 les/c/f=96/74/0 sis=97) [0] r=0 lpr=97 pi=[73,97)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:42 compute-1 ceph-mon[80126]: 10.13 scrub starts
Jan 23 09:53:42 compute-1 ceph-mon[80126]: 10.13 scrub ok
Jan 23 09:53:42 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:42 compute-1 ceph-mon[80126]: osdmap e97: 3 total, 3 up, 3 in
Jan 23 09:53:42 compute-1 ceph-mon[80126]: pgmap v130: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:42 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]: dispatch
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.068169) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022068451, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 7456, "num_deletes": 256, "total_data_size": 18505937, "memory_usage": 19292528, "flush_reason": "Manual Compaction"}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Jan 23 09:53:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:42 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022162814, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 11454816, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 251, "largest_seqno": 7461, "table_properties": {"data_size": 11424585, "index_size": 19300, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9861, "raw_key_size": 96698, "raw_average_key_size": 24, "raw_value_size": 11348944, "raw_average_value_size": 2880, "num_data_blocks": 851, "num_entries": 3940, "num_filter_entries": 3940, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 1769161847, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 94624 microseconds, and 26163 cpu microseconds.
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.162890) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 11454816 bytes OK
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.162931) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.164474) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.164520) EVENT_LOG_v1 {"time_micros": 1769162022164515, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.164543) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 18464253, prev total WAL file size 18464253, number of live WAL files 2.
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.168587) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730030' seq:72057594037927935, type:22 .. '7061786F7300323532' seq:0, type:0; will stop at (end)
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(10MB) 8(1773B)]
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022168742, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 11456589, "oldest_snapshot_seqno": -1}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 3688 keys, 11451458 bytes, temperature: kUnknown
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022238725, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 11451458, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11421854, "index_size": 19254, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 92454, "raw_average_key_size": 25, "raw_value_size": 11349349, "raw_average_value_size": 3077, "num_data_blocks": 850, "num_entries": 3688, "num_filter_entries": 3688, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162022, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.239082) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 11451458 bytes
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.241126) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 163.5 rd, 163.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(10.9, 0.0 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 3945, records dropped: 257 output_compression: NoCompression
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.241162) EVENT_LOG_v1 {"time_micros": 1769162022241146, "job": 4, "event": "compaction_finished", "compaction_time_micros": 70091, "compaction_time_cpu_micros": 27297, "output_level": 6, "num_output_files": 1, "total_output_size": 11451458, "num_input_records": 3945, "num_output_records": 3688, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022244782, "job": 4, "event": "table_file_deletion", "file_number": 14}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162022245029, "job": 4, "event": "table_file_deletion", "file_number": 8}
Jan 23 09:53:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:53:42.168407) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:53:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:42 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:42 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8002060 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:43 compute-1 ceph-mon[80126]: 10.11 scrub starts
Jan 23 09:53:43 compute-1 ceph-mon[80126]: 10.11 scrub ok
Jan 23 09:53:43 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "15"}]': finished
Jan 23 09:53:43 compute-1 ceph-mon[80126]: osdmap e98: 3 total, 3 up, 3 in
Jan 23 09:53:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e99 e99: 3 total, 3 up, 3 in
Jan 23 09:53:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=8 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 luod=0'0 crt=62'768 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=0/0 n=8 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'768 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=5 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 luod=0'0 crt=62'764 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [1] -> [0], acting_primary 1 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 99 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=0/0 n=5 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'764 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:44 compute-1 ceph-mon[80126]: 10.3 scrub starts
Jan 23 09:53:44 compute-1 ceph-mon[80126]: 10.3 scrub ok
Jan 23 09:53:44 compute-1 ceph-mon[80126]: pgmap v132: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 126 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:44 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]: dispatch
Jan 23 09:53:44 compute-1 ceph-mon[80126]: osdmap e99: 3 total, 3 up, 3 in
Jan 23 09:53:44 compute-1 ceph-mon[80126]: 10.14 scrub starts
Jan 23 09:53:44 compute-1 ceph-mon[80126]: 10.14 scrub ok
Jan 23 09:53:44 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e100 e100: 3 total, 3 up, 3 in
Jan 23 09:53:44 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.086140633s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'771 mlcod 0'0 active pruub 244.444183350s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:44 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.085646629s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'761 mlcod 0'0 active pruub 244.444198608s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:44 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.085577011s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 244.444198608s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:44 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=100 pruub=12.085281372s) [2] r=-1 lpr=100 pi=[82,100)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 244.444183350s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:44 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.d( v 62'768 (0'0,62'768] local-lis/les=99/100 n=8 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'768 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:44 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 100 pg[10.1d( v 62'764 (0'0,62'764] local-lis/les=99/100 n=5 ec=59/46 lis/c=97/78 les/c/f=98/79/0 sis=99) [0] r=0 lpr=99 pi=[78,99)/1 crt=62'764 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.c scrub starts
Jan 23 09:53:44 compute-1 ceph-osd[77616]: log_channel(cluster) log [DBG] : 10.c scrub ok
Jan 23 09:53:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc001ab0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:44 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:44 compute-1 sudo[86075]: pam_unix(sudo:session): session closed for user root
Jan 23 09:53:45 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e101 e101: 3 total, 3 up, 3 in
Jan 23 09:53:45 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:45 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=82/83 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:45 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'761 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:45 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 101 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=82/83 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] r=0 lpr=101 pi=[82,101)/1 crt=62'761 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:45 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "16"}]': finished
Jan 23 09:53:45 compute-1 ceph-mon[80126]: osdmap e100: 3 total, 3 up, 3 in
Jan 23 09:53:45 compute-1 ceph-mon[80126]: 10.c scrub starts
Jan 23 09:53:45 compute-1 ceph-mon[80126]: 10.c scrub ok
Jan 23 09:53:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:46 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8003550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:46 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:46 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e102 e102: 3 total, 3 up, 3 in
Jan 23 09:53:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 102 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=101/102 n=5 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] async=[2] r=0 lpr=101 pi=[82,101)/1 crt=62'761 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 102 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=101/102 n=7 ec=59/46 lis/c=82/82 les/c/f=83/83/0 sis=101) [2]/[0] async=[2] r=0 lpr=101 pi=[82,101)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:47 compute-1 ceph-mon[80126]: pgmap v135: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 126 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Jan 23 09:53:47 compute-1 ceph-mon[80126]: osdmap e101: 3 total, 3 up, 3 in
Jan 23 09:53:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:48 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:48 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9e8003550 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:48 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:50 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:52 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9dc002f40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:52 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:52 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9d8003e90 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:53:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e103 e103: 3 total, 3 up, 3 in
Jan 23 09:53:53 compute-1 ceph-mon[80126]: pgmap v137: 353 pgs: 2 peering, 351 active+clean; 457 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:53 compute-1 ceph-mon[80126]: osdmap e102: 3 total, 3 up, 3 in
Jan 23 09:53:53 compute-1 ceph-mon[80126]: pgmap v139: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:53 compute-1 ceph-mon[80126]: pgmap v140: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:53 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 09:53:53 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 09:53:53 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]: dispatch
Jan 23 09:53:53 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=101/102 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.513637543s) [2] async=[2] r=-1 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 62'771 active pruub 251.297988892s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:53 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.f( v 62'771 (0'0,62'771] local-lis/les=101/102 n=7 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.513319016s) [2] r=-1 lpr=103 pi=[82,103)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 251.297988892s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:53 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=101/102 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.508596420s) [2] async=[2] r=-1 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 62'761 active pruub 251.294219971s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:53 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=101/102 n=5 ec=59/46 lis/c=101/82 les/c/f=102/83/0 sis=103 pruub=9.508519173s) [2] r=-1 lpr=103 pi=[82,103)/1 crt=62'761 mlcod 0'0 unknown NOTIFY pruub 251.294219971s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:53 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=103 pruub=15.795506477s) [2] r=-1 lpr=103 pi=[59,103)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 257.581420898s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:53 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 103 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=103 pruub=15.795460701s) [2] r=-1 lpr=103 pi=[59,103)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 257.581420898s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:54 compute-1 kernel: ganesha.nfsd[85705]: segfault at 50 ip 00007faa6d51732e sp 00007fa9e3ffe210 error 4 in libntirpc.so.5.8[7faa6d4fc000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 23 09:53:54 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:53:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[84913]: 23/01/2026 09:53:54 : epoch 697344ec : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa9c8003c10 fd 39 proxy ignored for local
Jan 23 09:53:54 compute-1 systemd[1]: Created slice Slice /system/systemd-coredump.
Jan 23 09:53:54 compute-1 systemd[1]: Started Process Core Dump (PID 86143/UID 0).
Jan 23 09:53:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e104 e104: 3 total, 3 up, 3 in
Jan 23 09:53:54 compute-1 ceph-mon[80126]: pgmap v141: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 09:53:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 09:53:54 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "17"}]': finished
Jan 23 09:53:54 compute-1 ceph-mon[80126]: osdmap e103: 3 total, 3 up, 3 in
Jan 23 09:53:54 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 104 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=0 lpr=104 pi=[59,104)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:54 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 104 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=59/60 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] r=0 lpr=104 pi=[59,104)/1 crt=58'754 lcod 0'0 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:53:55 compute-1 systemd-coredump[86144]: Process 84917 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 55:
                                                   #0  0x00007faa6d51732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   #1  0x0000000000000000 n/a (n/a + 0x0)
                                                   #2  0x00007faa6d521900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                   ELF object binary architecture: AMD x86-64
Jan 23 09:53:55 compute-1 systemd[1]: systemd-coredump@0-86143-0.service: Deactivated successfully.
Jan 23 09:53:55 compute-1 systemd[1]: systemd-coredump@0-86143-0.service: Consumed 1.231s CPU time.
Jan 23 09:53:55 compute-1 podman[86150]: 2026-01-23 09:53:55.534004367 +0000 UTC m=+0.039553749 container died 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_REF=squid)
Jan 23 09:53:55 compute-1 systemd[1]: var-lib-containers-storage-overlay-c9232a39daee4ec9f2e0ef901a5a2ed006b307c0f9690c1c71a60180bcf88cd3-merged.mount: Deactivated successfully.
Jan 23 09:53:55 compute-1 systemd[82140]: Starting Mark boot as successful...
Jan 23 09:53:55 compute-1 podman[86150]: 2026-01-23 09:53:55.586359054 +0000 UTC m=+0.091908436 container remove 492464d17e3efa8e97d2081b98e45c6a967d44cbf33e90dad6d9fcfb5fce4ec9 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:53:55 compute-1 systemd[82140]: Finished Mark boot as successful.
Jan 23 09:53:55 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:53:55 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e105 e105: 3 total, 3 up, 3 in
Jan 23 09:53:55 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 105 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=104/105 n=2 ec=59/46 lis/c=59/59 les/c/f=60/60/0 sis=104) [2]/[0] async=[2] r=0 lpr=104 pi=[59,104)/1 crt=58'754 lcod 0'0 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:53:55 compute-1 ceph-mon[80126]: osdmap e104: 3 total, 3 up, 3 in
Jan 23 09:53:55 compute-1 ceph-mon[80126]: pgmap v144: 353 pgs: 2 active+remapped, 351 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 27 B/s, 0 objects/s recovering
Jan 23 09:53:55 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]: dispatch
Jan 23 09:53:55 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 09:53:55 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.782s CPU time.
Jan 23 09:53:56 compute-1 sshd-session[85710]: Connection closed by 192.168.122.30 port 53858
Jan 23 09:53:56 compute-1 sshd-session[85707]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:53:56 compute-1 systemd[1]: session-36.scope: Deactivated successfully.
Jan 23 09:53:56 compute-1 systemd[1]: session-36.scope: Consumed 8.579s CPU time.
Jan 23 09:53:56 compute-1 systemd-logind[807]: Session 36 logged out. Waiting for processes to exit.
Jan 23 09:53:56 compute-1 systemd-logind[807]: Removed session 36.
Jan 23 09:53:56 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e106 e106: 3 total, 3 up, 3 in
Jan 23 09:53:56 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=104/105 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106 pruub=14.980967522s) [2] async=[2] r=-1 lpr=106 pi=[59,106)/1 crt=58'754 lcod 0'0 mlcod 0'0 active pruub 259.864685059s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:53:56 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 106 pg[10.10( v 58'754 (0'0,58'754] local-lis/les=104/105 n=2 ec=59/46 lis/c=104/59 les/c/f=105/60/0 sis=106 pruub=14.980821609s) [2] r=-1 lpr=106 pi=[59,106)/1 crt=58'754 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 259.864685059s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:53:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "18"}]': finished
Jan 23 09:53:56 compute-1 ceph-mon[80126]: osdmap e105: 3 total, 3 up, 3 in
Jan 23 09:53:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:56 compute-1 ceph-mon[80126]: Deploying daemon haproxy.rgw.default.compute-0.qabsws on compute-0
Jan 23 09:53:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:53:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e107 e107: 3 total, 3 up, 3 in
Jan 23 09:53:57 compute-1 ceph-mon[80126]: osdmap e106: 3 total, 3 up, 3 in
Jan 23 09:53:57 compute-1 ceph-mon[80126]: pgmap v147: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:53:57 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]: dispatch
Jan 23 09:53:57 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "19"}]': finished
Jan 23 09:53:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:53:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.005000163s ======
Jan 23 09:53:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:53:58.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.005000163s
Jan 23 09:53:58 compute-1 ceph-mon[80126]: osdmap e107: 3 total, 3 up, 3 in
Jan 23 09:53:58 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:58 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:58 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:53:58 compute-1 ceph-mon[80126]: Deploying daemon haproxy.rgw.default.compute-2.izjwnk on compute-2
Jan 23 09:53:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e108 e108: 3 total, 3 up, 3 in
Jan 23 09:53:59 compute-1 ceph-mon[80126]: pgmap v149: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 459 B/s rd, 0 op/s
Jan 23 09:53:59 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]: dispatch
Jan 23 09:53:59 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "20"}]': finished
Jan 23 09:53:59 compute-1 ceph-mon[80126]: osdmap e108: 3 total, 3 up, 3 in
Jan 23 09:54:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095400 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:54:00 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e109 e109: 3 total, 3 up, 3 in
Jan 23 09:54:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:00.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:00.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:00 compute-1 ceph-mon[80126]: Health check failed: 1 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 09:54:00 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:00 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:54:00 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:54:00 compute-1 ceph-mon[80126]: Deploying daemon keepalived.rgw.default.compute-0.tytkrd on compute-0
Jan 23 09:54:00 compute-1 ceph-mon[80126]: osdmap e109: 3 total, 3 up, 3 in
Jan 23 09:54:01 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e110 e110: 3 total, 3 up, 3 in
Jan 23 09:54:01 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:01 compute-1 ceph-mon[80126]: pgmap v152: 353 pgs: 1 remapped+peering, 352 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 467 B/s rd, 0 op/s; 0 B/s, 0 objects/s recovering
Jan 23 09:54:01 compute-1 ceph-mon[80126]: osdmap e110: 3 total, 3 up, 3 in
Jan 23 09:54:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e111 e111: 3 total, 3 up, 3 in
Jan 23 09:54:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 09:54:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:02.497 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 09:54:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:02.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:03 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:03 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:03 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:03 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-2 interface br-ex
Jan 23 09:54:03 compute-1 ceph-mon[80126]: 192.168.122.2 is in 192.168.122.0/24 on compute-0 interface br-ex
Jan 23 09:54:03 compute-1 ceph-mon[80126]: Deploying daemon keepalived.rgw.default.compute-2.qpmsjd on compute-2
Jan 23 09:54:03 compute-1 ceph-mon[80126]: osdmap e111: 3 total, 3 up, 3 in
Jan 23 09:54:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e112 e112: 3 total, 3 up, 3 in
Jan 23 09:54:04 compute-1 ceph-mon[80126]: pgmap v155: 353 pgs: 1 remapped+peering, 352 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 0 B/s, 0 objects/s recovering
Jan 23 09:54:04 compute-1 ceph-mon[80126]: osdmap e112: 3 total, 3 up, 3 in
Jan 23 09:54:04 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:04.501 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:04.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e113 e113: 3 total, 3 up, 3 in
Jan 23 09:54:05 compute-1 ceph-mon[80126]: Deploying daemon prometheus.compute-0 on compute-0
Jan 23 09:54:05 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]: dispatch
Jan 23 09:54:05 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 1.
Jan 23 09:54:05 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:54:05 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.782s CPU time.
Jan 23 09:54:05 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:54:06 compute-1 podman[86246]: 2026-01-23 09:54:06.164848925 +0000 UTC m=+0.051647994 container create 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 09:54:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:06 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:06 compute-1 podman[86246]: 2026-01-23 09:54:06.229886428 +0000 UTC m=+0.116685547 container init 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:54:06 compute-1 podman[86246]: 2026-01-23 09:54:06.139382659 +0000 UTC m=+0.026181778 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:06 compute-1 podman[86246]: 2026-01-23 09:54:06.235026879 +0000 UTC m=+0.121825988 container start 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 09:54:06 compute-1 bash[86246]: 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd
Jan 23 09:54:06 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:54:06 compute-1 ceph-mon[80126]: pgmap v157: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 22 B/s, 1 objects/s recovering
Jan 23 09:54:06 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "21"}]': finished
Jan 23 09:54:06 compute-1 ceph-mon[80126]: osdmap e113: 3 total, 3 up, 3 in
Jan 23 09:54:06 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:06 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e114 e114: 3 total, 3 up, 3 in
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:54:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:54:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:06.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:06.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:07 compute-1 ceph-mon[80126]: osdmap e114: 3 total, 3 up, 3 in
Jan 23 09:54:07 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]: dispatch
Jan 23 09:54:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e115 e115: 3 total, 3 up, 3 in
Jan 23 09:54:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e115 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:08.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:08.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:08 compute-1 ceph-mon[80126]: pgmap v160: 353 pgs: 353 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 44 B/s, 1 objects/s recovering
Jan 23 09:54:08 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "22"}]': finished
Jan 23 09:54:08 compute-1 ceph-mon[80126]: osdmap e115: 3 total, 3 up, 3 in
Jan 23 09:54:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e116 e116: 3 total, 3 up, 3 in
Jan 23 09:54:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e117 e117: 3 total, 3 up, 3 in
Jan 23 09:54:10 compute-1 ceph-mon[80126]: osdmap e116: 3 total, 3 up, 3 in
Jan 23 09:54:10 compute-1 ceph-mon[80126]: pgmap v163: 353 pgs: 1 peering, 352 active+clean; 456 KiB data, 144 MiB used, 60 GiB / 60 GiB avail; 1023 B/s rd, 255 B/s wr, 1 op/s; 27 B/s, 2 objects/s recovering
Jan 23 09:54:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:10.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:10.721 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095410 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:54:11 compute-1 ceph-mon[80126]: osdmap e117: 3 total, 3 up, 3 in
Jan 23 09:54:12 compute-1 sshd-session[86307]: Accepted publickey for zuul from 192.168.122.30 port 59612 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:54:12 compute-1 systemd-logind[807]: New session 37 of user zuul.
Jan 23 09:54:12 compute-1 systemd[1]: Started Session 37 of User zuul.
Jan 23 09:54:12 compute-1 sshd-session[86307]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:54:12 compute-1 ceph-mon[80126]: pgmap v165: 353 pgs: 1 peering, 352 active+clean; 456 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 853 B/s rd, 426 B/s wr, 1 op/s; 0 B/s, 1 objects/s recovering
Jan 23 09:54:12 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:12 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:12 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:12 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr handle_mgr_map respawning because set of enabled modules changed!
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  e: '/usr/bin/ceph-mgr'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  0: '/usr/bin/ceph-mgr'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  1: '-n'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  2: 'mgr.compute-1.jmakme'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  3: '-f'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  4: '--setuser'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  5: 'ceph'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  6: '--setgroup'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  7: 'ceph'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  8: '--default-log-to-file=false'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  9: '--default-log-to-journald=true'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  10: '--default-log-to-stderr=false'
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr respawn  exe_path /proc/self/exe
Jan 23 09:54:12 compute-1 sshd-session[82155]: Connection closed by 192.168.122.100 port 38422
Jan 23 09:54:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setuser ceph since I am not root
Jan 23 09:54:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: ignoring --setgroup ceph since I am not root
Jan 23 09:54:12 compute-1 sshd-session[82136]: pam_unix(sshd:session): session closed for user ceph-admin
Jan 23 09:54:12 compute-1 systemd[1]: session-34.scope: Deactivated successfully.
Jan 23 09:54:12 compute-1 systemd[1]: session-34.scope: Consumed 20.705s CPU time.
Jan 23 09:54:12 compute-1 systemd-logind[807]: Session 34 logged out. Waiting for processes to exit.
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: ceph version 19.2.3 (c92aebb279828e9c3c1f5d24613efca272649e62) squid (stable), process ceph-mgr, pid 2
Jan 23 09:54:12 compute-1 systemd-logind[807]: Removed session 34.
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: pidfile_write: ignore empty --pid-file
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'alerts'
Jan 23 09:54:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:54:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'balancer'
Jan 23 09:54:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:12.487+0000 7f26bfd6e140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:12.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'cephadm'
Jan 23 09:54:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:12.576+0000 7f26bfd6e140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Jan 23 09:54:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:12.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:12 compute-1 python3.9[86480]: ansible-ansible.legacy.ping Invoked with data=pong
Jan 23 09:54:13 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'crash'
Jan 23 09:54:13 compute-1 ceph-mgr[80432]: mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:54:13 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'dashboard'
Jan 23 09:54:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:13.435+0000 7f26bfd6e140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 python3.9[86666]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'devicehealth'
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'diskprediction_local'
Jan 23 09:54:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.112+0000 7f26bfd6e140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Jan 23 09:54:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Jan 23 09:54:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]:   from numpy import show_config as show_numpy_config
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'influx'
Jan 23 09:54:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.280+0000 7f26bfd6e140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'insights'
Jan 23 09:54:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.348+0000 7f26bfd6e140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'iostat'
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'k8sevents'
Jan 23 09:54:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:14.477+0000 7f26bfd6e140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Jan 23 09:54:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:14.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:14.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'localpool'
Jan 23 09:54:14 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mds_autoscaler'
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'mirroring'
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'nfs'
Jan 23 09:54:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.460+0000 7f26bfd6e140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'orchestrator'
Jan 23 09:54:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.688+0000 7f26bfd6e140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_perf_query'
Jan 23 09:54:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.759+0000 7f26bfd6e140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'osd_support'
Jan 23 09:54:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.821+0000 7f26bfd6e140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'pg_autoscaler'
Jan 23 09:54:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.893+0000 7f26bfd6e140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'progress'
Jan 23 09:54:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:15.966+0000 7f26bfd6e140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Module progress has missing NOTIFY_TYPES member
Jan 23 09:54:15 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'prometheus'
Jan 23 09:54:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:16.321+0000 7f26bfd6e140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-1 ceph-mgr[80432]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rbd_support'
Jan 23 09:54:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:16.410+0000 7f26bfd6e140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-1 ceph-mgr[80432]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'restful'
Jan 23 09:54:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:16.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:16 compute-1 ceph-mon[80126]: from='mgr.14352 192.168.122.100:0/2738770404' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished
Jan 23 09:54:16 compute-1 ceph-mon[80126]: mgrmap e26: compute-0.nbdygh(active, since 2m), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.597773) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056597880, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1002, "num_deletes": 251, "total_data_size": 2240223, "memory_usage": 2276272, "flush_reason": "Manual Compaction"}
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056616142, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 1428848, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 7466, "largest_seqno": 8463, "table_properties": {"data_size": 1424071, "index_size": 2301, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10217, "raw_average_key_size": 18, "raw_value_size": 1414138, "raw_average_value_size": 2566, "num_data_blocks": 102, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162023, "oldest_key_time": 1769162023, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 18419 microseconds, and 6502 cpu microseconds.
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.616203) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 1428848 bytes OK
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.616229) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.617827) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.617880) EVENT_LOG_v1 {"time_micros": 1769162056617870, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.617904) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 2235010, prev total WAL file size 2235010, number of live WAL files 2.
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.619164) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760030' seq:72057594037927935, type:22 .. '6B7600323532' seq:0, type:0; will stop at (end)
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(1395KB)], [15(10MB)]
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056619265, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 12880306, "oldest_snapshot_seqno": -1}
Jan 23 09:54:16 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rgw'
Jan 23 09:54:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:16.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 3712 keys, 12441169 bytes, temperature: kUnknown
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056786078, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 12441169, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12410499, "index_size": 20320, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 9285, "raw_key_size": 94995, "raw_average_key_size": 25, "raw_value_size": 12336481, "raw_average_value_size": 3323, "num_data_blocks": 879, "num_entries": 3712, "num_filter_entries": 3712, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162056, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.786350) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 12441169 bytes
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.787569) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 77.2 rd, 74.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 10.9 +0.0 blob) out(11.9 +0.0 blob), read-write-amplify(17.7) write-amplify(8.7) OK, records in: 4239, records dropped: 527 output_compression: NoCompression
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.787592) EVENT_LOG_v1 {"time_micros": 1769162056787582, "job": 6, "event": "compaction_finished", "compaction_time_micros": 166887, "compaction_time_cpu_micros": 32208, "output_level": 6, "num_output_files": 1, "total_output_size": 12441169, "num_input_records": 4239, "num_output_records": 3712, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056787967, "job": 6, "event": "table_file_deletion", "file_number": 17}
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162056790216, "job": 6, "event": "table_file_deletion", "file_number": 15}
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.619045) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790244) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790249) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790253) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:54:16.790255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:54:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:16.861+0000 7f26bfd6e140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-1 ceph-mgr[80432]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Jan 23 09:54:16 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'rook'
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:17 compute-1 sudo[86821]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gzhnnnztcopxpgctuemdcbkaltphxrbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162056.6436183-89-268477628752128/AnsiballZ_command.py'
Jan 23 09:54:17 compute-1 sudo[86821]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:17 compute-1 python3.9[86823]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:54:17 compute-1 sudo[86821]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.429+0000 7f26bfd6e140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Module rook has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'selftest'
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.501+0000 7f26bfd6e140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'snap_schedule'
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.578+0000 7f26bfd6e140 -1 mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Module snap_schedule has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'stats'
Jan 23 09:54:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'status'
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.730+0000 7f26bfd6e140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Module status has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telegraf'
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.808+0000 7f26bfd6e140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'telemetry'
Jan 23 09:54:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:17.981+0000 7f26bfd6e140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Jan 23 09:54:17 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'test_orchestrator'
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:18.220+0000 7f26bfd6e140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'volumes'
Jan 23 09:54:18 compute-1 sudo[86975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzhrerudpkhcqnbymiyzdjhuyuekinka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162057.8332152-125-202955514271245/AnsiballZ_stat.py'
Jan 23 09:54:18 compute-1 sudo[86975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:18 compute-1 python3.9[86977]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:18.515+0000 7f26bfd6e140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: mgr[py] Loading python module 'zabbix'
Jan 23 09:54:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:18.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:18 compute-1 sudo[86975]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 2026-01-23T09:54:18.590+0000 7f26bfd6e140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [dashboard DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: mgr load Constructed class from module: dashboard
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [prometheus DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: mgr load Constructed class from module: prometheus
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [prometheus INFO root] server_addr: :: server_port: 9283
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [prometheus INFO root] Starting engine...
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: [23/Jan/2026:09:54:18] ENGINE Bus STARTING
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [dashboard INFO root] server: ssl=no host=:: port=8443
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:18] ENGINE Bus STARTING
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: CherryPy Checker:
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: The Application mounted at '' has an empty config.
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: 
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [dashboard INFO root] Configured CherryPy, starting engine...
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: ms_deliver_dispatch: unhandled message 0x55fdcbbb1860 mon_map magic: 0 from mon.2 v2:192.168.122.101:3300/0
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [dashboard INFO root] Starting engine...
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: [23/Jan/2026:09:54:18] ENGINE Serving on http://:::9283
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [dashboard INFO root] Engine started...
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:18] ENGINE Serving on http://:::9283
Jan 23 09:54:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-mgr-compute-1-jmakme[80428]: [23/Jan/2026:09:54:18] ENGINE Bus STARTED
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [prometheus INFO cherrypy.error] [23/Jan/2026:09:54:18] ENGINE Bus STARTED
Jan 23 09:54:18 compute-1 ceph-mgr[80432]: [prometheus INFO root] Engine started.
Jan 23 09:54:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:18.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:18 compute-1 ceph-mon[80126]: Standby manager daemon compute-1.jmakme restarted
Jan 23 09:54:18 compute-1 ceph-mon[80126]: Standby manager daemon compute-1.jmakme started
Jan 23 09:54:19 compute-1 sudo[87153]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-agznqeqhnwynocwxxocrjjmqgyduibhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162058.83353-158-142896975666379/AnsiballZ_file.py'
Jan 23 09:54:19 compute-1 sudo[87153]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:19 compute-1 python3.9[87155]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:54:19 compute-1 sudo[87153]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e118 e118: 3 total, 3 up, 3 in
Jan 23 09:54:19 compute-1 ceph-mon[80126]: mgrmap e27: compute-0.nbdygh(active, since 2m), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:19 compute-1 ceph-mon[80126]: Active manager daemon compute-0.nbdygh restarted
Jan 23 09:54:19 compute-1 ceph-mon[80126]: Activating manager daemon compute-0.nbdygh
Jan 23 09:54:20 compute-1 sudo[87306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ormqewjdbyaotfhzkudppyuhzeauvxua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162059.7815623-185-172019652750683/AnsiballZ_file.py'
Jan 23 09:54:20 compute-1 sudo[87306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:20 compute-1 sshd-session[87309]: Accepted publickey for ceph-admin from 192.168.122.100 port 46188 ssh2: RSA SHA256:KUDiO2K/X1wi9imZiH/VfiDaYgPU2ishZ01Sxv0ziUk
Jan 23 09:54:20 compute-1 python3.9[87308]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:54:20 compute-1 systemd-logind[807]: New session 38 of user ceph-admin.
Jan 23 09:54:20 compute-1 systemd[1]: Started Session 38 of User ceph-admin.
Jan 23 09:54:20 compute-1 sshd-session[87309]: pam_unix(sshd:session): session opened for user ceph-admin(uid=42477) by ceph-admin(uid=0)
Jan 23 09:54:20 compute-1 sudo[87306]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:20 compute-1 sudo[87341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:20 compute-1 sudo[87341]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:20.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:20 compute-1 sudo[87341]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:20 compute-1 sudo[87397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:54:20 compute-1 sudo[87397]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:20.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:20 compute-1 ceph-mon[80126]: osdmap e118: 3 total, 3 up, 3 in
Jan 23 09:54:20 compute-1 ceph-mon[80126]: mgrmap e28: compute-0.nbdygh(active, starting, since 0.0302434s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-0"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-1"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata", "id": "compute-2"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-0.ymknms"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-2.prgzmm"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata", "who": "cephfs.compute-1.bcvzvj"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-0.nbdygh", "id": "compute-0.nbdygh"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-2.uczrot", "id": "compute-2.uczrot"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr metadata", "who": "compute-1.jmakme", "id": "compute-1.jmakme"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mds metadata"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mon metadata"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: Manager daemon compute-0.nbdygh is now available
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/mirror_snapshot_schedule"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/compute-0.nbdygh/trash_purge_schedule"}]: dispatch
Jan 23 09:54:20 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot restarted
Jan 23 09:54:20 compute-1 ceph-mon[80126]: Standby manager daemon compute-2.uczrot started
Jan 23 09:54:21 compute-1 python3.9[87552]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:54:21 compute-1 podman[87583]: 2026-01-23 09:54:21.188940235 +0000 UTC m=+0.072641741 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2)
Jan 23 09:54:21 compute-1 network[87619]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:54:21 compute-1 network[87620]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:54:21 compute-1 network[87621]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:54:21 compute-1 podman[87583]: 2026-01-23 09:54:21.31837422 +0000 UTC m=+0.202075756 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS)
Jan 23 09:54:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095421 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:54:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [NOTICE] 022/095421 (4) : haproxy version is 2.3.17-d1c9119
Jan 23 09:54:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [NOTICE] 022/095421 (4) : path to executable is /usr/local/sbin/haproxy
Jan 23 09:54:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [ALERT] 022/095421 (4) : backend 'backend' has no server available!
Jan 23 09:54:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e119 e119: 3 total, 3 up, 3 in
Jan 23 09:54:22 compute-1 ceph-mon[80126]: mgrmap e29: compute-0.nbdygh(active, since 1.08581s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:22 compute-1 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Bus STARTING
Jan 23 09:54:22 compute-1 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Serving on http://192.168.122.100:8765
Jan 23 09:54:22 compute-1 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Serving on https://192.168.122.100:7150
Jan 23 09:54:22 compute-1 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Bus STARTED
Jan 23 09:54:22 compute-1 ceph-mon[80126]: [23/Jan/2026:09:54:21] ENGINE Client ('192.168.122.100', 52034) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Jan 23 09:54:22 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]: dispatch
Jan 23 09:54:22 compute-1 podman[87747]: 2026-01-23 09:54:22.332716862 +0000 UTC m=+0.062650119 container exec 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:22 compute-1 podman[87747]: 2026-01-23 09:54:22.36915603 +0000 UTC m=+0.099089237 container exec_died 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:22.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:22 compute-1 podman[87859]: 2026-01-23 09:54:22.687365606 +0000 UTC m=+0.055565038 container exec 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 09:54:22 compute-1 podman[87859]: 2026-01-23 09:54:22.700732434 +0000 UTC m=+0.068931866 container exec_died 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 09:54:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:22.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:22 compute-1 podman[87927]: 2026-01-23 09:54:22.924282131 +0000 UTC m=+0.053441262 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 09:54:22 compute-1 podman[87927]: 2026-01-23 09:54:22.933654364 +0000 UTC m=+0.062813505 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 09:54:23 compute-1 podman[88004]: 2026-01-23 09:54:23.153342759 +0000 UTC m=+0.056827747 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2023-02-22T09:23:20, io.openshift.expose-services=, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, com.redhat.component=keepalived-container, description=keepalived for Ceph, version=2.2.4, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.buildah.version=1.28.2, name=keepalived, io.openshift.tags=Ceph keepalived, architecture=x86_64, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793)
Jan 23 09:54:23 compute-1 podman[88004]: 2026-01-23 09:54:23.190940135 +0000 UTC m=+0.094425083 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=keepalived-container, io.openshift.tags=Ceph keepalived, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, io.buildah.version=1.28.2, name=keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-type=git, description=keepalived for Ceph, version=2.2.4, build-date=2023-02-22T09:23:20, io.openshift.expose-services=)
Jan 23 09:54:23 compute-1 sudo[87397]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:23 compute-1 ceph-mon[80126]: pgmap v4: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:23 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "23"}]': finished
Jan 23 09:54:23 compute-1 ceph-mon[80126]: osdmap e119: 3 total, 3 up, 3 in
Jan 23 09:54:23 compute-1 ceph-mon[80126]: mgrmap e30: compute-0.nbdygh(active, since 2s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:54:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:54:23 compute-1 sudo[88081]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:23 compute-1 sudo[88081]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:23 compute-1 sudo[88081]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:23 compute-1 sudo[88109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:54:23 compute-1 sudo[88109]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa538000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:24 compute-1 sudo[88109]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:24 compute-1 sudo[88247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:24 compute-1 sudo[88247]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:24 compute-1 sudo[88247]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:24.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:24 compute-1 sudo[88295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 09:54:24 compute-1 sudo[88295]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:24.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:24 compute-1 python3.9[88370]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:54:24 compute-1 sudo[88295]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:24 compute-1 ceph-mon[80126]: pgmap v6: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]: dispatch
Jan 23 09:54:25 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e120 e120: 3 total, 3 up, 3 in
Jan 23 09:54:25 compute-1 python3.9[88537]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:54:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095426 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:54:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "24"}]': finished
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-1 ceph-mon[80126]: osdmap e120: 3 total, 3 up, 3 in
Jan 23 09:54:26 compute-1 ceph-mon[80126]: mgrmap e31: compute-0.nbdygh(active, since 5s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]: dispatch
Jan 23 09:54:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528000f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:26.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:26 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e121 e121: 3 total, 3 up, 3 in
Jan 23 09:54:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:54:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:26.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:27 compute-1 python3.9[88692]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:54:27 compute-1 ceph-mon[80126]: pgmap v8: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "25"}]': finished
Jan 23 09:54:27 compute-1 ceph-mon[80126]: osdmap e121: 3 total, 3 up, 3 in
Jan 23 09:54:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 09:54:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Jan 23 09:54:27 compute-1 sudo[88747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:54:27 compute-1 sudo[88747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:27 compute-1 sudo[88747]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:27 compute-1 sudo[88801]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:54:27 compute-1 sudo[88801]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:27 compute-1 sudo[88801]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:27 compute-1 sudo[88849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:27 compute-1 sudo[88849]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:27 compute-1 sudo[88849]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[88887]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:28 compute-1 sudo[88887]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[88887]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[88958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynriwlfspaucfpiwzhjlzivuzversioa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162067.7941544-329-213643985122366/AnsiballZ_setup.py'
Jan 23 09:54:28 compute-1 sudo[88958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:28 compute-1 sudo[88943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:28 compute-1 sudo[88943]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[88943]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:28 compute-1 sudo[89000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:28 compute-1 sudo[89000]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89000]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[89025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new
Jan 23 09:54:28 compute-1 sudo[89025]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89025]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[89050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Jan 23 09:54:28 compute-1 sudo[89050]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89050]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[89075]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:28 compute-1 sudo[89075]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 python3.9[88974]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:54:28 compute-1 sudo[89075]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[89103]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:28 compute-1 sudo[89103]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89103]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:28.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:28 compute-1 sudo[89131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:28 compute-1 sudo[89131]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89131]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[89156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:28 compute-1 sudo[89156]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89156]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[88958]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 sudo[89183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:28 compute-1 sudo[89183]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89183]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:28.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 09:54:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:54:28 compute-1 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.conf
Jan 23 09:54:28 compute-1 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.conf
Jan 23 09:54:28 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.conf
Jan 23 09:54:28 compute-1 ceph-mon[80126]: pgmap v10: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 682 B/s wr, 14 op/s
Jan 23 09:54:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]: dispatch
Jan 23 09:54:28 compute-1 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e122 e122: 3 total, 3 up, 3 in
Jan 23 09:54:28 compute-1 sudo[89231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:28 compute-1 sudo[89231]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89231]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:28 compute-1 sudo[89256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new
Jan 23 09:54:28 compute-1 sudo[89256]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:28 compute-1 sudo[89256]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:29 compute-1 sudo[89282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89282]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Jan 23 09:54:29 compute-1 sudo[89331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89331]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srjyshqjobquonyzifrotattgfurgvqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162067.7941544-329-213643985122366/AnsiballZ_dnf.py'
Jan 23 09:54:29 compute-1 sudo[89425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:54:29 compute-1 sudo[89384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph
Jan 23 09:54:29 compute-1 sudo[89384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89384]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-1 sudo[89432]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89432]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:29 compute-1 sudo[89457]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89457]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 122 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=122 pruub=14.288291931s) [1] r=-1 lpr=122 pi=[89,122)/1 crt=62'771 mlcod 0'0 active pruub 291.752838135s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:29 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 122 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=122 pruub=14.288235664s) [1] r=-1 lpr=122 pi=[89,122)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 291.752838135s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:29 compute-1 sudo[89482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-1 sudo[89482]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89482]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 python3.9[89429]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:54:29 compute-1 sudo[89531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-1 sudo[89531]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89531]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-1 sudo[89557]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89557]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:29 compute-1 sudo[89582]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89582]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:29 compute-1 sudo[89607]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89607]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:29 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:54:29 compute-1 sudo[89633]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config
Jan 23 09:54:29 compute-1 sudo[89633]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89633]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89660]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-1 sudo[89660]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89660]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:29 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.conf
Jan 23 09:54:29 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "26"}]': finished
Jan 23 09:54:29 compute-1 ceph-mon[80126]: osdmap e122: 3 total, 3 up, 3 in
Jan 23 09:54:29 compute-1 ceph-mon[80126]: Updating compute-0:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:29 compute-1 ceph-mon[80126]: Updating compute-1:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e123 e123: 3 total, 3 up, 3 in
Jan 23 09:54:29 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 123 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=123) [1]/[0] r=0 lpr=123 pi=[89,123)/1 crt=62'771 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:29 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 123 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=89/90 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=123) [1]/[0] r=0 lpr=123 pi=[89,123)/1 crt=62'771 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:29 compute-1 sudo[89688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:29 compute-1 sudo[89688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89688]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:29 compute-1 sudo[89716]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:29 compute-1 sudo[89716]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:29 compute-1 sudo[89716]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-1 sudo[89767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-1 sudo[89767]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-1 sudo[89767]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-1 sudo[89792]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new
Jan 23 09:54:30 compute-1 sudo[89792]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-1 sudo[89792]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-1 sudo[89820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-f3005f84-239a-55b6-a948-8f1fb592b920/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring.new /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-1 sudo[89820]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:30 compute-1 sudo[89820]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:30.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 09:54:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:30.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 09:54:30 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e124 e124: 3 total, 3 up, 3 in
Jan 23 09:54:30 compute-1 ceph-mon[80126]: Updating compute-0:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-1 ceph-mon[80126]: Updating compute-1:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-1 ceph-mon[80126]: Updating compute-2:/etc/ceph/ceph.client.admin.keyring
Jan 23 09:54:30 compute-1 ceph-mon[80126]: osdmap e123: 3 total, 3 up, 3 in
Jan 23 09:54:30 compute-1 ceph-mon[80126]: pgmap v13: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 880 B/s wr, 18 op/s
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-1 ceph-mon[80126]: Updating compute-2:/var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/config/ceph.client.admin.keyring
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]: dispatch
Jan 23 09:54:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:54:30 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 124 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=123/124 n=7 ec=59/46 lis/c=89/89 les/c/f=90/90/0 sis=123) [1]/[0] async=[1] r=0 lpr=123 pi=[89,123)/1 crt=62'771 mlcod 0'0 active+remapped mbc={255={(0+1)=7}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:30 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e125 e125: 3 total, 3 up, 3 in
Jan 23 09:54:30 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 125 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=123/124 n=7 ec=59/46 lis/c=123/89 les/c/f=124/90/0 sis=125 pruub=15.890355110s) [1] async=[1] r=-1 lpr=125 pi=[89,125)/1 crt=62'771 mlcod 62'771 active pruub 295.053192139s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:30 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 125 pg[10.19( v 62'771 (0'0,62'771] local-lis/les=123/124 n=7 ec=59/46 lis/c=123/89 les/c/f=124/90/0 sis=125 pruub=15.890196800s) [1] r=-1 lpr=125 pi=[89,125)/1 crt=62'771 mlcod 0'0 unknown NOTIFY pruub 295.053192139s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:31 compute-1 ceph-mon[80126]: pgmap v14: 353 pgs: 353 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 836 B/s wr, 17 op/s
Jan 23 09:54:31 compute-1 ceph-mon[80126]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 09:54:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 09:54:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "27"}]': finished
Jan 23 09:54:31 compute-1 ceph-mon[80126]: osdmap e124: 3 total, 3 up, 3 in
Jan 23 09:54:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:54:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:54:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:31 compute-1 ceph-mon[80126]: osdmap e125: 3 total, 3 up, 3 in
Jan 23 09:54:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e126 e126: 3 total, 3 up, 3 in
Jan 23 09:54:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5300029b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:32.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:32.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095432 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:54:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:32 compute-1 ceph-mon[80126]: osdmap e126: 3 total, 3 up, 3 in
Jan 23 09:54:32 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]: dispatch
Jan 23 09:54:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e127 e127: 3 total, 3 up, 3 in
Jan 23 09:54:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:54:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:54:33 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 127 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=127 pruub=15.797190666s) [1] r=-1 lpr=127 pi=[92,127)/1 crt=60'756 mlcod 0'0 active pruub 297.948699951s@ mbc={}] PeeringState::start_peering_interval up [0] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 0 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:33 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 127 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=127 pruub=15.797089577s) [1] r=-1 lpr=127 pi=[92,127)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 297.948699951s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:34 compute-1 ceph-mon[80126]: pgmap v18: 353 pgs: 1 active+recovering+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 6.7 KiB/s rd, 3.0 KiB/s wr, 10 op/s; 1/227 objects misplaced (0.441%); 36 B/s, 2 objects/s recovering
Jan 23 09:54:34 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "28"}]': finished
Jan 23 09:54:34 compute-1 ceph-mon[80126]: osdmap e127: 3 total, 3 up, 3 in
Jan 23 09:54:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e128 e128: 3 total, 3 up, 3 in
Jan 23 09:54:34 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 128 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=128) [1]/[0] r=0 lpr=128 pi=[92,128)/1 crt=60'756 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [1] -> [1], acting [1] -> [0], acting_primary 1 -> 0, up_primary 1 -> 1, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:34 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 128 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=92/93 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=128) [1]/[0] r=0 lpr=128 pi=[92,128)/1 crt=60'756 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c0016e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5300032d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:34.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:34.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:35 compute-1 ceph-mon[80126]: osdmap e128: 3 total, 3 up, 3 in
Jan 23 09:54:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]: dispatch
Jan 23 09:54:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:54:35 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e129 e129: 3 total, 3 up, 3 in
Jan 23 09:54:35 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 129 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=128/129 n=2 ec=59/46 lis/c=92/92 les/c/f=93/93/0 sis=128) [1]/[0] async=[1] r=0 lpr=128 pi=[92,128)/1 crt=60'756 mlcod 0'0 active+remapped mbc={255={(0+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:36 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e130 e130: 3 total, 3 up, 3 in
Jan 23 09:54:36 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 130 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=128/129 n=2 ec=59/46 lis/c=128/92 les/c/f=129/93/0 sis=130 pruub=14.894608498s) [1] async=[1] r=-1 lpr=130 pi=[92,130)/1 crt=60'756 mlcod 60'756 active pruub 299.372436523s@ mbc={255={}}] PeeringState::start_peering_interval up [1] -> [1], acting [0] -> [1], acting_primary 0 -> 1, up_primary 1 -> 1, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:36 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 130 pg[10.1b( v 60'756 (0'0,60'756] local-lis/les=128/129 n=2 ec=59/46 lis/c=128/92 les/c/f=129/93/0 sis=130 pruub=14.894343376s) [1] r=-1 lpr=130 pi=[92,130)/1 crt=60'756 mlcod 0'0 unknown NOTIFY pruub 299.372436523s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:36 compute-1 ceph-mon[80126]: pgmap v21: 353 pgs: 1 active+recovering+remapped, 352 active+clean; 457 KiB data, 148 MiB used, 60 GiB / 60 GiB avail; 5.0 KiB/s rd, 2.3 KiB/s wr, 7 op/s; 1/227 objects misplaced (0.441%); 27 B/s, 1 objects/s recovering
Jan 23 09:54:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "29"}]': finished
Jan 23 09:54:36 compute-1 ceph-mon[80126]: osdmap e129: 3 total, 3 up, 3 in
Jan 23 09:54:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528001ab0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:36.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:36.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:36 compute-1 sudo[89902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:54:36 compute-1 sudo[89902]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:36 compute-1 sudo[89902]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:36 compute-1 sudo[89927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:54:36 compute-1 sudo[89927]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:36 compute-1 sudo[89927]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:37 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:54:37 compute-1 ceph-mon[80126]: osdmap e130: 3 total, 3 up, 3 in
Jan 23 09:54:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]: dispatch
Jan 23 09:54:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:54:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:54:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e131 e131: 3 total, 3 up, 3 in
Jan 23 09:54:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e131 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c000e00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:38 compute-1 ceph-mon[80126]: pgmap v24: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 1023 B/s wr, 4 op/s; 27 B/s, 0 objects/s recovering
Jan 23 09:54:38 compute-1 ceph-mon[80126]: Reconfiguring mon.compute-0 (monmap changed)...
Jan 23 09:54:38 compute-1 ceph-mon[80126]: Reconfiguring daemon mon.compute-0 on compute-0
Jan 23 09:54:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "30"}]': finished
Jan 23 09:54:38 compute-1 ceph-mon[80126]: osdmap e131: 3 total, 3 up, 3 in
Jan 23 09:54:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.compute-0.nbdygh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
Jan 23 09:54:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 09:54:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:38.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:38.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:39 compute-1 ceph-mon[80126]: Reconfiguring mgr.compute-0.nbdygh (monmap changed)...
Jan 23 09:54:39 compute-1 ceph-mon[80126]: Reconfiguring daemon mgr.compute-0.nbdygh on compute-0
Jan 23 09:54:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-0", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:54:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]: dispatch
Jan 23 09:54:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e132 e132: 3 total, 3 up, 3 in
Jan 23 09:54:39 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 132 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=132 pruub=10.343003273s) [2] r=-1 lpr=132 pi=[80,132)/1 crt=62'763 mlcod 0'0 active pruub 298.424377441s@ mbc={}] PeeringState::start_peering_interval up [0] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:39 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 132 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=132 pruub=10.342778206s) [2] r=-1 lpr=132 pi=[80,132)/1 crt=62'763 mlcod 0'0 unknown NOTIFY pruub 298.424377441s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5100016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c001920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:40 compute-1 ceph-mon[80126]: Reconfiguring crash.compute-0 (monmap changed)...
Jan 23 09:54:40 compute-1 ceph-mon[80126]: Reconfiguring daemon crash.compute-0 on compute-0
Jan 23 09:54:40 compute-1 ceph-mon[80126]: pgmap v26: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.5 KiB/s rd, 883 B/s wr, 4 op/s; 23 B/s, 0 objects/s recovering
Jan 23 09:54:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:40 compute-1 ceph-mon[80126]: Reconfiguring osd.1 (monmap changed)...
Jan 23 09:54:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch
Jan 23 09:54:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:40 compute-1 ceph-mon[80126]: Reconfiguring daemon osd.1 on compute-0
Jan 23 09:54:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "31"}]': finished
Jan 23 09:54:40 compute-1 ceph-mon[80126]: osdmap e132: 3 total, 3 up, 3 in
Jan 23 09:54:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:40.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e133 e133: 3 total, 3 up, 3 in
Jan 23 09:54:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 133 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=0 lpr=133 pi=[80,133)/1 crt=62'763 mlcod 0'0 remapped NOTIFY mbc={}] PeeringState::start_peering_interval up [2] -> [2], acting [2] -> [0], acting_primary 2 -> 0, up_primary 2 -> 2, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:40 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 133 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=80/81 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] r=0 lpr=133 pi=[80,133)/1 crt=62'763 mlcod 0'0 remapped mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:40.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:41 compute-1 ceph-mon[80126]: Reconfiguring alertmanager.compute-0 (dependencies changed)...
Jan 23 09:54:41 compute-1 ceph-mon[80126]: Reconfiguring daemon alertmanager.compute-0 on compute-0
Jan 23 09:54:41 compute-1 ceph-mon[80126]: osdmap e133: 3 total, 3 up, 3 in
Jan 23 09:54:41 compute-1 ceph-mon[80126]: pgmap v29: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail
Jan 23 09:54:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]: dispatch
Jan 23 09:54:41 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e134 e134: 3 total, 3 up, 3 in
Jan 23 09:54:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 134 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=134) [0] r=0 lpr=134 pi=[103,134)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:41 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 134 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=133/134 n=5 ec=59/46 lis/c=80/80 les/c/f=81/81/0 sis=133) [2]/[0] async=[2] r=0 lpr=133 pi=[80,133)/1 crt=62'763 mlcod 0'0 active+remapped mbc={255={(0+1)=5}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5100016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:42.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:42.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c001920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e135 e135: 3 total, 3 up, 3 in
Jan 23 09:54:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=133/134 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135 pruub=14.816746712s) [2] async=[2] r=-1 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 62'763 active pruub 306.180419922s@ mbc={255={}}] PeeringState::start_peering_interval up [2] -> [2], acting [0] -> [2], acting_primary 0 -> 2, up_primary 2 -> 2, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1e( v 62'763 (0'0,62'763] local-lis/les=133/134 n=5 ec=59/46 lis/c=133/80 les/c/f=134/81/0 sis=135 pruub=14.816671371s) [2] r=-1 lpr=135 pi=[80,135)/1 crt=62'763 mlcod 0'0 unknown NOTIFY pruub 306.180419922s@ mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=-1 lpr=135 pi=[103,135)/1 crt=0'0 mlcod 0'0 remapped mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [0] -> [2], acting_primary 0 -> 2, up_primary 0 -> 0, role 0 -> -1, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:43 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 135 pg[10.1f( empty local-lis/les=0/0 n=0 ec=59/46 lis/c=103/103 les/c/f=104/104/0 sis=135) [0]/[2] r=-1 lpr=135 pi=[103,135)/1 crt=0'0 mlcod 0'0 remapped NOTIFY mbc={}] state<Start>: transitioning to Stray
Jan 23 09:54:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095443 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:54:43 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd='[{"prefix": "osd pool set", "pool": "default.rgw.log", "var": "pgp_num_actual", "val": "32"}]': finished
Jan 23 09:54:43 compute-1 ceph-mon[80126]: osdmap e134: 3 total, 3 up, 3 in
Jan 23 09:54:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:44.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:44.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5100016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:45 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e136 e136: 3 total, 3 up, 3 in
Jan 23 09:54:45 compute-1 ceph-mon[80126]: pgmap v31: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 564 B/s rd, 188 B/s wr, 1 op/s
Jan 23 09:54:45 compute-1 ceph-mon[80126]: osdmap e135: 3 total, 3 up, 3 in
Jan 23 09:54:46 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e137 e137: 3 total, 3 up, 3 in
Jan 23 09:54:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137) [0] r=0 lpr=137 pi=[103,137)/1 luod=0'0 crt=62'761 mlcod 0'0 active mbc={}] PeeringState::start_peering_interval up [0] -> [0], acting [2] -> [0], acting_primary 2 -> 0, up_primary 0 -> 0, role -1 -> 0, features acting 4540701547738038271 upacting 4540701547738038271
Jan 23 09:54:46 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 137 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=0/0 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137) [0] r=0 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Jan 23 09:54:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c001920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:46.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:46 compute-1 ceph-mon[80126]: pgmap v33: 353 pgs: 1 remapped+peering, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 588 B/s rd, 196 B/s wr, 1 op/s
Jan 23 09:54:46 compute-1 ceph-mon[80126]: osdmap e136: 3 total, 3 up, 3 in
Jan 23 09:54:46 compute-1 ceph-mon[80126]: osdmap e137: 3 total, 3 up, 3 in
Jan 23 09:54:46 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:46 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:46 compute-1 ceph-mon[80126]: Reconfiguring grafana.compute-0 (dependencies changed)...
Jan 23 09:54:46 compute-1 ceph-mon[80126]: Reconfiguring daemon grafana.compute-0 on compute-0
Jan 23 09:54:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:46.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 e138: 3 total, 3 up, 3 in
Jan 23 09:54:47 compute-1 ceph-osd[77616]: osd.0 pg_epoch: 138 pg[10.1f( v 62'761 (0'0,62'761] local-lis/les=137/138 n=5 ec=59/46 lis/c=135/103 les/c/f=136/104/0 sis=137) [0] r=0 lpr=137 pi=[103,137)/1 crt=62'761 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Jan 23 09:54:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:48 compute-1 ceph-mon[80126]: pgmap v36: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 23 B/s, 2 objects/s recovering
Jan 23 09:54:48 compute-1 ceph-mon[80126]: osdmap e138: 3 total, 3 up, 3 in
Jan 23 09:54:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:48 compute-1 sudo[90005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:48 compute-1 sudo[90005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:48 compute-1 sudo[90005]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:48 compute-1 sudo[90030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:48 compute-1 sudo[90030]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c002db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:48.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:48.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:48 compute-1 podman[90071]: 2026-01-23 09:54:48.811204544 +0000 UTC m=+0.072750805 container create 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 09:54:48 compute-1 systemd[1]: Started libpod-conmon-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope.
Jan 23 09:54:48 compute-1 podman[90071]: 2026-01-23 09:54:48.783930872 +0000 UTC m=+0.045477183 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:48 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:54:48 compute-1 podman[90071]: 2026-01-23 09:54:48.908866256 +0000 UTC m=+0.170412527 container init 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
Jan 23 09:54:48 compute-1 podman[90071]: 2026-01-23 09:54:48.917863167 +0000 UTC m=+0.179409418 container start 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.build-date=20250325)
Jan 23 09:54:48 compute-1 podman[90071]: 2026-01-23 09:54:48.921429409 +0000 UTC m=+0.182975750 container attach 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 09:54:48 compute-1 vigilant_lederberg[90088]: 167 167
Jan 23 09:54:48 compute-1 systemd[1]: libpod-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope: Deactivated successfully.
Jan 23 09:54:48 compute-1 conmon[90088]: conmon 19ee01b7379ed6f34fbc <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope/container/memory.events
Jan 23 09:54:48 compute-1 podman[90071]: 2026-01-23 09:54:48.926229589 +0000 UTC m=+0.187775840 container died 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2)
Jan 23 09:54:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:48 compute-1 systemd[1]: var-lib-containers-storage-overlay-31799003e0affa5ff836fc49d3057b4f51eef1f354c9380b2fea3a2c64b052d6-merged.mount: Deactivated successfully.
Jan 23 09:54:48 compute-1 podman[90071]: 2026-01-23 09:54:48.973716443 +0000 UTC m=+0.235262694 container remove 19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_lederberg, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:54:48 compute-1 systemd[1]: libpod-conmon-19ee01b7379ed6f34fbc1c109aaf9fac21e0ee1c70bb0cbacfab1f0c68214401.scope: Deactivated successfully.
Jan 23 09:54:49 compute-1 sudo[90030]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:49 compute-1 sudo[90107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:49 compute-1 sudo[90107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:49 compute-1 sudo[90107]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:49 compute-1 sudo[90132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:49 compute-1 sudo[90132]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-1 ceph-mon[80126]: Reconfiguring crash.compute-1 (monmap changed)...
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-1", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:49 compute-1 ceph-mon[80126]: Reconfiguring daemon crash.compute-1 on compute-1
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
Jan 23 09:54:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:49 compute-1 podman[90174]: 2026-01-23 09:54:49.543375588 +0000 UTC m=+0.046772314 container create c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 09:54:49 compute-1 systemd[1]: Started libpod-conmon-c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d.scope.
Jan 23 09:54:49 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:54:49 compute-1 podman[90174]: 2026-01-23 09:54:49.523127544 +0000 UTC m=+0.026524300 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:49 compute-1 podman[90174]: 2026-01-23 09:54:49.625835164 +0000 UTC m=+0.129231980 container init c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.vendor=CentOS)
Jan 23 09:54:49 compute-1 podman[90174]: 2026-01-23 09:54:49.631723968 +0000 UTC m=+0.135120694 container start c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 09:54:49 compute-1 suspicious_germain[90189]: 167 167
Jan 23 09:54:49 compute-1 systemd[1]: libpod-c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d.scope: Deactivated successfully.
Jan 23 09:54:49 compute-1 podman[90174]: 2026-01-23 09:54:49.636416655 +0000 UTC m=+0.139813381 container attach c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default)
Jan 23 09:54:49 compute-1 podman[90174]: 2026-01-23 09:54:49.636983783 +0000 UTC m=+0.140380509 container died c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 09:54:49 compute-1 systemd[1]: var-lib-containers-storage-overlay-f305bf7e8e7cb32a61126d20d70c5e49e56dd18c574b01be6a816ddedbbf10a8-merged.mount: Deactivated successfully.
Jan 23 09:54:49 compute-1 podman[90174]: 2026-01-23 09:54:49.675583729 +0000 UTC m=+0.178980455 container remove c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=suspicious_germain, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 09:54:49 compute-1 systemd[1]: libpod-conmon-c848be5dfb9bac2c4b6030fe512edf950aa5ca8b7d0cbf39d8138d230601233d.scope: Deactivated successfully.
Jan 23 09:54:49 compute-1 sudo[90132]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:49 compute-1 sudo[90212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:49 compute-1 sudo[90212]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:49 compute-1 sudo[90212]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:49 compute-1 sudo[90237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:49 compute-1 sudo[90237]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:50 compute-1 podman[90280]: 2026-01-23 09:54:50.288250958 +0000 UTC m=+0.043812501 container create 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:54:50 compute-1 systemd[1]: Started libpod-conmon-6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf.scope.
Jan 23 09:54:50 compute-1 systemd[1]: Started libcrun container.
Jan 23 09:54:50 compute-1 podman[90280]: 2026-01-23 09:54:50.352385142 +0000 UTC m=+0.107946715 container init 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, OSD_FLAVOR=default)
Jan 23 09:54:50 compute-1 podman[90280]: 2026-01-23 09:54:50.358779161 +0000 UTC m=+0.114340734 container start 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0)
Jan 23 09:54:50 compute-1 vigilant_ishizaka[90296]: 167 167
Jan 23 09:54:50 compute-1 podman[90280]: 2026-01-23 09:54:50.268935523 +0000 UTC m=+0.024497096 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:54:50 compute-1 systemd[1]: libpod-6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf.scope: Deactivated successfully.
Jan 23 09:54:50 compute-1 podman[90280]: 2026-01-23 09:54:50.36480039 +0000 UTC m=+0.120361983 container attach 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2)
Jan 23 09:54:50 compute-1 podman[90280]: 2026-01-23 09:54:50.365083009 +0000 UTC m=+0.120644562 container died 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325)
Jan 23 09:54:50 compute-1 ceph-mon[80126]: pgmap v38: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 20 B/s, 1 objects/s recovering
Jan 23 09:54:50 compute-1 ceph-mon[80126]: Reconfiguring osd.0 (monmap changed)...
Jan 23 09:54:50 compute-1 ceph-mon[80126]: Reconfiguring daemon osd.0 on compute-1
Jan 23 09:54:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:54:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:54:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:54:50 compute-1 systemd[1]: var-lib-containers-storage-overlay-f947111ae5e2f892803a770e79cef77868f171642ad71c41b77171b2f2206cd5-merged.mount: Deactivated successfully.
Jan 23 09:54:50 compute-1 podman[90280]: 2026-01-23 09:54:50.407697991 +0000 UTC m=+0.163259534 container remove 6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=vigilant_ishizaka, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 09:54:50 compute-1 systemd[1]: libpod-conmon-6d61f0c5a479d74a6d91dee5106c234fb5331f292d8ca65deeb9504bd61e47bf.scope: Deactivated successfully.
Jan 23 09:54:50 compute-1 sudo[90237]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:50 compute-1 sudo[90312]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:50 compute-1 sudo[90312]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:50 compute-1 sudo[90312]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:50.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:50 compute-1 sudo[90337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/prometheus/node-exporter:v1.7.0 --timeout 895 _orch deploy --fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 09:54:50 compute-1 sudo[90337]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:54:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:50.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:54:50 compute-1 systemd[1]: Stopping Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:54:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c002db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:51 compute-1 podman[90411]: 2026-01-23 09:54:51.064071864 +0000 UTC m=+0.044340356 container died 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-1a9cf7871bcaed94d55bf97d20a317e95aa8ecd54623be987a412d3816ee0ab4-merged.mount: Deactivated successfully.
Jan 23 09:54:51 compute-1 podman[90411]: 2026-01-23 09:54:51.107039718 +0000 UTC m=+0.087308210 container remove 965059b6604140ae839e574b936faab3cd21da5835a6853031ca4245ed8258a6 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:51 compute-1 bash[90411]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1
Jan 23 09:54:51 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@node-exporter.compute-1.service: Main process exited, code=exited, status=143/n/a
Jan 23 09:54:51 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@node-exporter.compute-1.service: Failed with result 'exit-code'.
Jan 23 09:54:51 compute-1 systemd[1]: Stopped Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:54:51 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@node-exporter.compute-1.service: Consumed 2.430s CPU time.
Jan 23 09:54:51 compute-1 systemd[1]: Starting Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:54:51 compute-1 ceph-mon[80126]: Reconfiguring mon.compute-1 (monmap changed)...
Jan 23 09:54:51 compute-1 ceph-mon[80126]: Reconfiguring daemon mon.compute-1 on compute-1
Jan 23 09:54:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:51 compute-1 podman[90516]: 2026-01-23 09:54:51.551307762 +0000 UTC m=+0.046136912 container create 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:51 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e14a3ca03b62f0376285049141ffbb302ed3ee029612a5beb01e4b4d9873ce85/merged/etc/node-exporter supports timestamps until 2038 (0x7fffffff)
Jan 23 09:54:51 compute-1 podman[90516]: 2026-01-23 09:54:51.614468537 +0000 UTC m=+0.109297707 container init 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:51 compute-1 podman[90516]: 2026-01-23 09:54:51.619771032 +0000 UTC m=+0.114600192 container start 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:51 compute-1 bash[90516]: 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78
Jan 23 09:54:51 compute-1 podman[90516]: 2026-01-23 09:54:51.533486506 +0000 UTC m=+0.028315676 image pull 72c9c208898624938c9e4183d6686ea4a5fd3f912bc29bc3f00147924c521a3e quay.io/prometheus/node-exporter:v1.7.0
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.629Z caller=node_exporter.go:192 level=info msg="Starting node_exporter" version="(version=1.7.0, branch=HEAD, revision=7333465abf9efba81876303bb57e6fadb946041b)"
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.629Z caller=node_exporter.go:193 level=info msg="Build context" build_context="(go=go1.21.4, platform=linux/amd64, user=root@35918982f6d8, date=20231112-23:53:35, tags=netgo osusergo static_build)"
Jan 23 09:54:51 compute-1 systemd[1]: Started Ceph node-exporter.compute-1 for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.632Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.632Z caller=diskstats_linux.go:265 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=arp
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=bcache
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=bonding
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=btrfs
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=conntrack
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=cpu
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=cpufreq
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=diskstats
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=dmi
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=edac
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=entropy
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=fibrechannel
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=filefd
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=filesystem
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=hwmon
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=infiniband
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=ipvs
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=loadavg
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=mdadm
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=meminfo
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=netclass
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.633Z caller=node_exporter.go:117 level=info collector=netdev
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=netstat
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=nfs
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=nfsd
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=nvme
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=os
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=powersupplyclass
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=pressure
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=rapl
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=schedstat
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=selinux
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=sockstat
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=softnet
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=stat
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=tapestats
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=textfile
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=thermal_zone
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=time
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=udp_queues
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=uname
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=vmstat
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=xfs
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.634Z caller=node_exporter.go:117 level=info collector=zfs
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.637Z caller=tls_config.go:274 level=info msg="Listening on" address=[::]:9100
Jan 23 09:54:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1[90531]: ts=2026-01-23T09:54:51.637Z caller=tls_config.go:277 level=info msg="TLS is disabled." http2=false address=[::]:9100
Jan 23 09:54:51 compute-1 sudo[90337]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:52 compute-1 ceph-mon[80126]: Reconfiguring node-exporter.compute-1 (unknown last config time)...
Jan 23 09:54:52 compute-1 ceph-mon[80126]: Reconfiguring daemon node-exporter.compute-1 on compute-1
Jan 23 09:54:52 compute-1 ceph-mon[80126]: pgmap v39: 353 pgs: 1 active+remapped, 352 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 18 B/s, 1 objects/s recovering
Jan 23 09:54:52 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:52 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:52 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
Jan 23 09:54:52 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
Jan 23 09:54:52 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:52.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:52.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:53 compute-1 ceph-mon[80126]: Reconfiguring mon.compute-2 (monmap changed)...
Jan 23 09:54:53 compute-1 ceph-mon[80126]: Reconfiguring daemon mon.compute-2 on compute-2
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.compute-2", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 23 09:54:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:53 compute-1 sudo[90543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:54:53 compute-1 sudo[90543]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:53 compute-1 sudo[90543]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:53 compute-1 sudo[90568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 09:54:53 compute-1 sudo[90568]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c002db0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:54 compute-1 podman[90673]: 2026-01-23 09:54:54.239258531 +0000 UTC m=+0.077409860 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:54:54 compute-1 podman[90673]: 2026-01-23 09:54:54.376985786 +0000 UTC m=+0.215137095 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:54:54 compute-1 ceph-mon[80126]: Reconfiguring crash.compute-2 (unknown last config time)...
Jan 23 09:54:54 compute-1 ceph-mon[80126]: Reconfiguring daemon crash.compute-2 on compute-2
Jan 23 09:54:54 compute-1 ceph-mon[80126]: pgmap v40: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 419 B/s rd, 0 op/s; 15 B/s, 1 objects/s recovering
Jan 23 09:54:54 compute-1 ceph-mon[80126]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch
Jan 23 09:54:54 compute-1 ceph-mon[80126]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch
Jan 23 09:54:54 compute-1 ceph-mon[80126]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://192.168.122.100:3000"}]: dispatch
Jan 23 09:54:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:54.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:54.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:54 compute-1 podman[90795]: 2026-01-23 09:54:54.857604726 +0000 UTC m=+0.062198444 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:54 compute-1 podman[90795]: 2026-01-23 09:54:54.865959787 +0000 UTC m=+0.070553505 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 09:54:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:55 compute-1 podman[90885]: 2026-01-23 09:54:55.2173277 +0000 UTC m=+0.050647314 container exec 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:54:55 compute-1 podman[90885]: 2026-01-23 09:54:55.230905094 +0000 UTC m=+0.064224708 container exec_died 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 09:54:55 compute-1 podman[90949]: 2026-01-23 09:54:55.425437514 +0000 UTC m=+0.053039099 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 09:54:55 compute-1 podman[90949]: 2026-01-23 09:54:55.43491291 +0000 UTC m=+0.062514465 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 09:54:55 compute-1 ceph-mon[80126]: pgmap v41: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 350 B/s rd, 0 op/s
Jan 23 09:54:55 compute-1 podman[91017]: 2026-01-23 09:54:55.690241 +0000 UTC m=+0.114863451 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, name=keepalived, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, io.openshift.tags=Ceph keepalived, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., io.buildah.version=1.28.2, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2023-02-22T09:23:20, com.redhat.component=keepalived-container, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, description=keepalived for Ceph)
Jan 23 09:54:55 compute-1 podman[91037]: 2026-01-23 09:54:55.781714348 +0000 UTC m=+0.070036160 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, release=1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, io.openshift.expose-services=, summary=Provides keepalived on RHEL 9 for Ceph., architecture=x86_64, distribution-scope=public, com.redhat.component=keepalived-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=keepalived, version=2.2.4, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, description=keepalived for Ceph, io.buildah.version=1.28.2)
Jan 23 09:54:55 compute-1 podman[91017]: 2026-01-23 09:54:55.787076586 +0000 UTC m=+0.211699027 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, io.buildah.version=1.28.2, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.component=keepalived-container, io.openshift.expose-services=, architecture=x86_64, build-date=2023-02-22T09:23:20, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=keepalived for Ceph, version=2.2.4, io.openshift.tags=Ceph keepalived, distribution-scope=public)
Jan 23 09:54:55 compute-1 sudo[90568]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:56.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:54:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:56.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:54:56 compute-1 sudo[91059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:54:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:56 compute-1 sudo[91059]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:54:56 compute-1 sudo[91059]: pam_unix(sudo:session): session closed for user root
Jan 23 09:54:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:54:57 compute-1 ceph-mon[80126]: pgmap v42: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:54:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:54:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:54:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:54:58.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:54:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:54:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:54:58.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:54:58 compute-1 ceph-mon[80126]: pgmap v43: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 286 B/s rd, 0 op/s
Jan 23 09:54:58 compute-1 ceph-mon[80126]: Health check cleared: CEPHADM_FAILED_DAEMON (was: 1 failed cephadm daemon(s))
Jan 23 09:54:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:54:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:00.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:00.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:01 compute-1 ceph-mon[80126]: pgmap v44: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:02.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:02.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:02 compute-1 sudo[91092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:55:02 compute-1 sudo[91092]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:02 compute-1 sudo[91092]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:03 compute-1 ceph-mon[80126]: pgmap v45: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 559 B/s rd, 0 op/s
Jan 23 09:55:03 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:55:03 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:55:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:04.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:04.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:05 compute-1 ceph-mon[80126]: pgmap v46: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:55:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003d10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:06.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:55:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:06.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:55:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:07 compute-1 ceph-mon[80126]: pgmap v47: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003d30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:08.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:55:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:08.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:55:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:09 compute-1 ceph-mon[80126]: pgmap v48: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 279 B/s rd, 0 op/s
Jan 23 09:55:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:10.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:10.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:11 compute-1 ceph-mon[80126]: pgmap v49: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:12 compute-1 ceph-mon[80126]: pgmap v50: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:55:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:12.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:12.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:14 compute-1 sudo[89425]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518002360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:14.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:14.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:14 compute-1 ceph-mon[80126]: pgmap v51: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:16.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:16.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:16 compute-1 sudo[91150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:55:16 compute-1 sudo[91150]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:16 compute-1 sudo[91150]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518002360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:17 compute-1 ceph-mon[80126]: pgmap v52: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:18 compute-1 sshd-session[91176]: Invalid user sol from 45.148.10.240 port 44746
Jan 23 09:55:18 compute-1 sshd-session[91176]: Connection closed by invalid user sol 45.148.10.240 port 44746 [preauth]
Jan 23 09:55:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:18.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:18.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:19 compute-1 ceph-mon[80126]: pgmap v53: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003070 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:20.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:20.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:21 compute-1 ceph-mon[80126]: pgmap v54: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:22 compute-1 ceph-mon[80126]: pgmap v55: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 510 B/s rd, 0 op/s
Jan 23 09:55:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003070 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:22.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:22.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:23 compute-1 sudo[91305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqsgdmoaudwmwzbktwkwjexdiygclpmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162122.7753065-365-132858881380668/AnsiballZ_command.py'
Jan 23 09:55:23 compute-1 sudo[91305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:23 compute-1 python3.9[91307]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:55:23 compute-1 sudo[91305]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:24.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:24.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:24 compute-1 ceph-mon[80126]: pgmap v56: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:25 compute-1 sudo[91593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scixpvzmnsizruarfmvpljtgmfyqprib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162124.4865537-389-12552009932159/AnsiballZ_selinux.py'
Jan 23 09:55:25 compute-1 sudo[91593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:25 compute-1 python3.9[91595]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Jan 23 09:55:25 compute-1 sudo[91593]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:26 compute-1 sudo[91746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izgxdbtfieiycyduczsxvxuwfegewedd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162125.9427757-422-62878296867957/AnsiballZ_command.py'
Jan 23 09:55:26 compute-1 sudo[91746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:26 compute-1 python3.9[91748]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Jan 23 09:55:26 compute-1 sudo[91746]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:26.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:26.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:26 compute-1 ceph-mon[80126]: pgmap v57: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ed0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:27 compute-1 sudo[91898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pyxclzerbekshazrhzdjwlmsdgymwclt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162126.78893-446-177558747463654/AnsiballZ_file.py'
Jan 23 09:55:27 compute-1 sudo[91898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:27 compute-1 python3.9[91900]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:55:27 compute-1 sudo[91898]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:27 compute-1 sudo[92051]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbgokvdksqeqfrewkfgtqcswfcbtqssf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162127.466358-470-104086113917241/AnsiballZ_mount.py'
Jan 23 09:55:27 compute-1 sudo[92051]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:28 compute-1 python3.9[92053]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Jan 23 09:55:28 compute-1 sudo[92051]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:28.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:28.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:29 compute-1 sudo[92204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cevneswwymcbooqhrhsqbeucgakjybal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162129.2718496-554-246031162305014/AnsiballZ_file.py'
Jan 23 09:55:29 compute-1 sudo[92204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:29 compute-1 ceph-mon[80126]: pgmap v58: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:29 compute-1 python3.9[92206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:29 compute-1 sudo[92204]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:30 compute-1 ceph-mon[80126]: pgmap v59: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:55:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:30.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:30 compute-1 sudo[92356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljkbacmwkhfsduqmlcsltqxroebuctsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162130.5365987-578-273341475600433/AnsiballZ_stat.py'
Jan 23 09:55:30 compute-1 sudo[92356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:30.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:31 compute-1 python3.9[92358]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:55:31 compute-1 sudo[92356]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:31 compute-1 sudo[92434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qncaimkqfoimnqgtyvdcwtklpauangaa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162130.5365987-578-273341475600433/AnsiballZ_file.py'
Jan 23 09:55:31 compute-1 sudo[92434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:31 compute-1 python3.9[92437]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem _original_basename=tls-ca-bundle.pem recurse=False state=file path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:55:31 compute-1 sudo[92434]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003ef0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:32.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:32 compute-1 sudo[92587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqnakjrhnbmvwsfuopbexoapcjormbfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162132.4152913-641-14794913577247/AnsiballZ_stat.py'
Jan 23 09:55:32 compute-1 sudo[92587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:32.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:32 compute-1 python3.9[92589]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:55:32 compute-1 sudo[92587]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:33 compute-1 ceph-mon[80126]: pgmap v60: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 425 B/s rd, 0 op/s
Jan 23 09:55:33 compute-1 sudo[92742]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-etzakpnqboyrzzbhrvqynjhykzpcjipk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162133.5353584-680-177243112866818/AnsiballZ_getent.py'
Jan 23 09:55:33 compute-1 sudo[92742]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:34 compute-1 python3.9[92744]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Jan 23 09:55:34 compute-1 sudo[92742]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003f10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:34 compute-1 ceph-mon[80126]: pgmap v61: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:34.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:34 compute-1 sudo[92895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfazdldlusikgujsiwbdqzrharzqwpsd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162134.496984-710-41810647326166/AnsiballZ_getent.py'
Jan 23 09:55:34 compute-1 sudo[92895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:34.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:34 compute-1 python3.9[92897]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Jan 23 09:55:34 compute-1 sudo[92895]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095535 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:55:35 compute-1 sudo[93049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqxgfmkqzcmcsoxqeaqzlkreempwdqtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162135.2208033-734-8234755722208/AnsiballZ_group.py'
Jan 23 09:55:35 compute-1 sudo[93049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:35 compute-1 python3.9[93051]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 09:55:35 compute-1 sudo[93049]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa52c003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:36 compute-1 sudo[93201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymmdtyxndxlomhzkrrhlgdvdgszvvfmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162136.2658503-761-81187010456303/AnsiballZ_file.py'
Jan 23 09:55:36 compute-1 sudo[93201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:36.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:36 compute-1 python3.9[93203]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Jan 23 09:55:36 compute-1 sudo[93201]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:36.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:37 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:37 compute-1 sudo[93228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:55:37 compute-1 sudo[93228]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:37 compute-1 sudo[93228]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:37 compute-1 ceph-mon[80126]: pgmap v62: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:37 compute-1 sudo[93379]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdnulipionvtfavwsgsiacakrseubmov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162137.33973-794-53926997912665/AnsiballZ_dnf.py'
Jan 23 09:55:37 compute-1 sudo[93379]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:37 compute-1 python3.9[93381]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:55:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:38.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:38.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:39 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:39 compute-1 sudo[93379]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:39 compute-1 ceph-mon[80126]: pgmap v63: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:39 compute-1 sshd-session[71326]: Received disconnect from 38.129.56.17 port 49808:11: disconnected by user
Jan 23 09:55:39 compute-1 sshd-session[71326]: Disconnected from user zuul 38.129.56.17 port 49808
Jan 23 09:55:39 compute-1 sshd-session[71323]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:55:39 compute-1 systemd[1]: session-19.scope: Deactivated successfully.
Jan 23 09:55:39 compute-1 systemd[1]: session-19.scope: Consumed 9.124s CPU time.
Jan 23 09:55:39 compute-1 systemd-logind[807]: Session 19 logged out. Waiting for processes to exit.
Jan 23 09:55:39 compute-1 systemd-logind[807]: Removed session 19.
Jan 23 09:55:39 compute-1 sudo[93534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gdkukqvkjevguuljbwlmhqrfodrrvrqu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162139.578849-818-247834790877611/AnsiballZ_file.py'
Jan 23 09:55:39 compute-1 sudo[93534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:40 compute-1 python3.9[93536]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:40 compute-1 sudo[93534]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa528003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:40 compute-1 ceph-mon[80126]: pgmap v64: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:55:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:40 compute-1 sudo[93688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dvrdkvkpikaseamykybrlxtxojzcxswu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162140.331898-842-254051255766584/AnsiballZ_stat.py'
Jan 23 09:55:40 compute-1 sudo[93688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:40.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:40 compute-1 python3.9[93690]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:55:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:40.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:40 compute-1 sudo[93688]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:41 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001e00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:41 compute-1 sudo[93766]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsoewexkzwyevqhthsxzxbkhbzrrkllx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162140.331898-842-254051255766584/AnsiballZ_file.py'
Jan 23 09:55:41 compute-1 sudo[93766]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:41 compute-1 python3.9[93768]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/modules-load.d/99-edpm.conf _original_basename=edpm-modprobe.conf.j2 recurse=False state=file path=/etc/modules-load.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:41 compute-1 sudo[93766]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:41 compute-1 sudo[93919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hmdcsfkvkeqxryvsdfkncvkwfcgcvehz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162141.5584354-881-88738312438081/AnsiballZ_stat.py'
Jan 23 09:55:41 compute-1 sudo[93919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:42 compute-1 python3.9[93921]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:55:42 compute-1 sudo[93919]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:42 compute-1 sudo[93997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twzaervpesivzemwpwkgknctfhdejlvf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162141.5584354-881-88738312438081/AnsiballZ_file.py'
Jan 23 09:55:42 compute-1 sudo[93997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:42 compute-1 python3.9[93999]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/sysctl.d/99-edpm.conf _original_basename=edpm-sysctl.conf.j2 recurse=False state=file path=/etc/sysctl.d/99-edpm.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:55:42 compute-1 sudo[93997]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:42.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:42.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:43 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:43 compute-1 sudo[94149]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gwefxnlejmkbgznoqykboirzvhlvlzol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162143.0439227-926-173324118344838/AnsiballZ_dnf.py'
Jan 23 09:55:43 compute-1 sudo[94149]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:43 compute-1 ceph-mon[80126]: pgmap v65: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:55:43 compute-1 python3.9[94151]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:55:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:55:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530001e00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:44.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:44.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:44 compute-1 ceph-mon[80126]: pgmap v66: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:55:44 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Jan 23 09:55:44 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:44.943083) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:55:44 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Jan 23 09:55:44 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162144943241, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2001, "num_deletes": 251, "total_data_size": 8190339, "memory_usage": 8441216, "flush_reason": "Manual Compaction"}
Jan 23 09:55:44 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145009671, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 5055840, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8468, "largest_seqno": 10464, "table_properties": {"data_size": 5047090, "index_size": 5308, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2373, "raw_key_size": 19703, "raw_average_key_size": 20, "raw_value_size": 5028702, "raw_average_value_size": 5332, "num_data_blocks": 236, "num_entries": 943, "num_filter_entries": 943, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162058, "oldest_key_time": 1769162058, "file_creation_time": 1769162144, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 66615 microseconds, and 14380 cpu microseconds.
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:55:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:45 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.009764) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 5055840 bytes OK
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.009807) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.057969) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.058082) EVENT_LOG_v1 {"time_micros": 1769162145058055, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.058115) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 8180689, prev total WAL file size 8180689, number of live WAL files 2.
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.062426) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300323531' seq:72057594037927935, type:22 .. '7061786F7300353033' seq:0, type:0; will stop at (end)
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4937KB)], [18(11MB)]
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145062575, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 17497009, "oldest_snapshot_seqno": -1}
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 4121 keys, 13782054 bytes, temperature: kUnknown
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145176562, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 13782054, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13748468, "index_size": 22212, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10309, "raw_key_size": 105010, "raw_average_key_size": 25, "raw_value_size": 13666938, "raw_average_value_size": 3316, "num_data_blocks": 954, "num_entries": 4121, "num_filter_entries": 4121, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162145, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.176947) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 13782054 bytes
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.180383) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 153.3 rd, 120.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.8, 11.9 +0.0 blob) out(13.1 +0.0 blob), read-write-amplify(6.2) write-amplify(2.7) OK, records in: 4655, records dropped: 534 output_compression: NoCompression
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.180417) EVENT_LOG_v1 {"time_micros": 1769162145180402, "job": 8, "event": "compaction_finished", "compaction_time_micros": 114164, "compaction_time_cpu_micros": 36470, "output_level": 6, "num_output_files": 1, "total_output_size": 13782054, "num_input_records": 4655, "num_output_records": 4121, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145181425, "job": 8, "event": "table_file_deletion", "file_number": 20}
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162145183481, "job": 8, "event": "table_file_deletion", "file_number": 18}
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.062283) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183616) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183620) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:55:45.183624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:55:45 compute-1 sudo[94149]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:46 compute-1 python3.9[94304]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:55:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:46 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530002b50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:46.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:46.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:46 compute-1 ceph-mon[80126]: pgmap v67: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:55:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:47 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:47 compute-1 python3.9[94456]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Jan 23 09:55:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:47 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:55:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:47 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:55:48 compute-1 python3.9[94607]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:55:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:48 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:48.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:48.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:48 compute-1 ceph-mon[80126]: pgmap v68: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:55:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:49 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530002b50 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:49 compute-1 sudo[94758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eigwtzgddedstvbwszlizbaalddutakv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162148.7898252-1049-27893227964195/AnsiballZ_systemd.py'
Jan 23 09:55:49 compute-1 sudo[94758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:49 compute-1 python3.9[94760]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:55:49 compute-1 systemd[1]: Stopping Dynamic System Tuning Daemon...
Jan 23 09:55:49 compute-1 systemd[1]: tuned.service: Deactivated successfully.
Jan 23 09:55:49 compute-1 systemd[1]: Stopped Dynamic System Tuning Daemon.
Jan 23 09:55:49 compute-1 systemd[1]: Starting Dynamic System Tuning Daemon...
Jan 23 09:55:50 compute-1 systemd[1]: Started Dynamic System Tuning Daemon.
Jan 23 09:55:50 compute-1 sudo[94758]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:50.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:50.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:50 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:55:50 compute-1 ceph-mon[80126]: pgmap v69: 353 pgs: 353 active+clean; 457 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:55:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:55:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:51 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:51 compute-1 python3.9[94921]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Jan 23 09:55:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530003860 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:52 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:52.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:52.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:53 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:53 compute-1 ceph-mon[80126]: pgmap v70: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Jan 23 09:55:53 compute-1 ceph-mon[80126]: mgrmap e32: compute-0.nbdygh(active, since 92s), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 09:55:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:54 compute-1 ceph-mon[80126]: pgmap v71: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1.2 KiB/s wr, 2 op/s
Jan 23 09:55:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:54 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530003860 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:54.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:54.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:55 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:55 compute-1 sudo[95074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-akivxisonlsiapdcwquadlfpsbkhznhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162155.1921592-1220-223456355663116/AnsiballZ_systemd.py'
Jan 23 09:55:55 compute-1 sudo[95074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095555 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:55:55 compute-1 python3.9[95076]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:55:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:56 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:56.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:56 compute-1 sudo[95074]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:56 compute-1 ceph-mon[80126]: pgmap v72: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 1.2 KiB/s wr, 2 op/s
Jan 23 09:55:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:56.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:57 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:57 compute-1 sudo[95149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:55:57 compute-1 sudo[95149]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:55:57 compute-1 sudo[95149]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:57 compute-1 sudo[95254]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awddabjbijuljsomjxruisuazdnybyyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162156.9911535-1220-26029770964919/AnsiballZ_systemd.py'
Jan 23 09:55:57 compute-1 sudo[95254]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:55:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:55:57 compute-1 python3.9[95256]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:55:57 compute-1 sudo[95254]: pam_unix(sudo:session): session closed for user root
Jan 23 09:55:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:58 compute-1 sshd-session[86310]: Connection closed by 192.168.122.30 port 59612
Jan 23 09:55:58 compute-1 sshd-session[86307]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:55:58 compute-1 systemd[1]: session-37.scope: Deactivated successfully.
Jan 23 09:55:58 compute-1 systemd[1]: session-37.scope: Consumed 1min 7.362s CPU time.
Jan 23 09:55:58 compute-1 systemd-logind[807]: Session 37 logged out. Waiting for processes to exit.
Jan 23 09:55:58 compute-1 systemd-logind[807]: Removed session 37.
Jan 23 09:55:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:58 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:55:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:55:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:55:58.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:55:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:55:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:55:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:55:58.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:55:58 compute-1 ceph-mon[80126]: pgmap v73: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 1.2 KiB/s wr, 3 op/s
Jan 23 09:55:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:55:59 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:00 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:56:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:00.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:56:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:00.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:00 compute-1 ceph-mon[80126]: pgmap v74: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:56:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:01 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:02 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:02.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:02.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:03 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:03 compute-1 sudo[95285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:56:03 compute-1 sudo[95285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:03 compute-1 sudo[95285]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:03 compute-1 sudo[95310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:56:03 compute-1 sudo[95310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:03 compute-1 ceph-mon[80126]: pgmap v75: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:56:03 compute-1 sudo[95310]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:04 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:56:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:04.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:56:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:56:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:04.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:56:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:05 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:05 compute-1 ceph-mon[80126]: pgmap v76: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:56:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:06 compute-1 sshd-session[95367]: Accepted publickey for zuul from 192.168.122.30 port 46764 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:56:06 compute-1 systemd-logind[807]: New session 39 of user zuul.
Jan 23 09:56:06 compute-1 systemd[1]: Started Session 39 of User zuul.
Jan 23 09:56:06 compute-1 sshd-session[95367]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:56:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:06 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:06.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:06 compute-1 ceph-mon[80126]: pgmap v77: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:56:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:06.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:07 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:07 compute-1 python3.9[95520]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:08 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:08.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:56:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:08.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:56:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:09 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:09 compute-1 sudo[95676]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbgbjkdqybywndjymbqjnclminbrsbhu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162168.9956803-64-280309541757142/AnsiballZ_getent.py'
Jan 23 09:56:09 compute-1 sudo[95676]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:09 compute-1 python3.9[95678]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Jan 23 09:56:09 compute-1 sudo[95676]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:09 compute-1 ceph-mon[80126]: pgmap v78: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:56:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa51c003eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:10 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:56:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:10.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:56:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:56:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:10.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:56:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:11 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:11 compute-1 sudo[95829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssbniepizkizykqhcwbddhdpewgbvjzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162170.7550015-100-95021229619376/AnsiballZ_setup.py'
Jan 23 09:56:11 compute-1 sudo[95829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:11 compute-1 python3.9[95831]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:11 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:11 compute-1 ceph-mon[80126]: pgmap v79: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:11 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:11 compute-1 sudo[95829]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:12 compute-1 sudo[95914]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vbildbyudhtonucizbbghwfvqvmsbsjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162170.7550015-100-95021229619376/AnsiballZ_dnf.py'
Jan 23 09:56:12 compute-1 sudo[95914]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:12 compute-1 python3.9[95916]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:56:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:12 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:56:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:12.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:56:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:56:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:56:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:12 compute-1 ceph-mon[80126]: pgmap v80: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:56:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:56:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:56:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:56:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:12.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:13 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:13 compute-1 sudo[95914]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:14 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:14.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:14 compute-1 sudo[96070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jmwtmklhfrhsvjkdsetpurihunlfcezm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162174.4460573-142-188489395665390/AnsiballZ_dnf.py'
Jan 23 09:56:14 compute-1 sudo[96070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:14.941 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:15 compute-1 python3.9[96072]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:15 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:15 compute-1 ceph-mon[80126]: pgmap v81: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:16 compute-1 sudo[96070]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:16 compute-1 ceph-mon[80126]: pgmap v82: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:16 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:16.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:16.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:17 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:17 compute-1 sudo[96151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:56:17 compute-1 sudo[96151]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:17 compute-1 sudo[96151]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:17 compute-1 sudo[96250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pqthjcthvqwpogxxouykmsfqsuabcyuj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162176.742079-166-167218203436434/AnsiballZ_systemd.py'
Jan 23 09:56:17 compute-1 sudo[96250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:17 compute-1 python3.9[96252]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 09:56:17 compute-1 sudo[96250]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:18 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:18 compute-1 python3.9[96405]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:18.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:18.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:19 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:19 compute-1 sudo[96556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yhrolvdohesozaouzsbimaepyfrmjxio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162179.0681198-220-229120141850644/AnsiballZ_sefcontext.py'
Jan 23 09:56:19 compute-1 sudo[96556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:19 compute-1 ceph-mon[80126]: pgmap v83: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 09:56:19 compute-1 python3.9[96558]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Jan 23 09:56:20 compute-1 sudo[96556]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:20 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:20.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:20 compute-1 python3.9[96708]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:20.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:21 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280014c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095621 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:56:21 compute-1 sudo[96865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qdouuszjrxipchxzbcrvvrmabfbztrvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162181.4665859-274-280392654533991/AnsiballZ_dnf.py'
Jan 23 09:56:21 compute-1 sudo[96865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:21 compute-1 ceph-mon[80126]: pgmap v84: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:22 compute-1 python3.9[96867]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:22 compute-1 sudo[96869]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:56:22 compute-1 sudo[96869]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:22 compute-1 sudo[96869]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:22 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:22.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:56:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:22.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:56:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:23 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:23 compute-1 sudo[96865]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:23 compute-1 ceph-mon[80126]: pgmap v85: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:56:23 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:23 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:56:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:24 compute-1 sudo[97044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzghacidvplhyysnqcdqddalhuxwrqcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162183.8819113-298-37923812056658/AnsiballZ_command.py'
Jan 23 09:56:24 compute-1 sudo[97044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:24 compute-1 python3.9[97046]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:56:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:24 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:24.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:24.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:25 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:25 compute-1 sudo[97044]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:25 compute-1 ceph-mon[80126]: pgmap v86: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:25 compute-1 sudo[97332]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ijniqwhsnidivvsduqpifppkaoubwylo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162185.5093741-322-66350034373027/AnsiballZ_file.py'
Jan 23 09:56:25 compute-1 sudo[97332]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:26 compute-1 python3.9[97334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Jan 23 09:56:26 compute-1 sudo[97332]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:26 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:26.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:26.959 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:27 compute-1 python3.9[97484]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:56:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:27 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:27 compute-1 ceph-mon[80126]: pgmap v87: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:27 compute-1 sudo[97637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emuoqqvnlewvcnduayjezkqimvuhoehh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162187.2671719-370-21616427380042/AnsiballZ_dnf.py'
Jan 23 09:56:27 compute-1 sudo[97637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:27 compute-1 python3.9[97639]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:28 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:28.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:28.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:29 compute-1 ceph-mon[80126]: pgmap v88: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:56:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:29 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:29 compute-1 sudo[97637]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:30 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:30.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:30 compute-1 sudo[97791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shtecdgaaozjeswaubfpixerfotyovlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162190.473407-397-96691916813107/AnsiballZ_dnf.py'
Jan 23 09:56:30 compute-1 sudo[97791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:30.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:31 compute-1 python3.9[97793]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:31 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:31 compute-1 ceph-mon[80126]: pgmap v89: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:56:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:32 compute-1 sudo[97791]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:32 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:32.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:56:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:32.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:56:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:33 compute-1 ceph-mon[80126]: pgmap v90: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:33 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:56:33 compute-1 sudo[97946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mahcrrrvlgefbsdmgopdnktmlablyqhl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162193.382954-433-96469756732117/AnsiballZ_stat.py'
Jan 23 09:56:33 compute-1 sudo[97946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:33 compute-1 python3.9[97948]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:56:33 compute-1 sudo[97946]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:34 compute-1 sudo[98100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sfowwcxfqxtcqqnfvyrdxyzbyiymlqwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162194.1578796-457-259289536006216/AnsiballZ_slurp.py'
Jan 23 09:56:34 compute-1 sudo[98100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:34 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:34.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:34 compute-1 python3.9[98102]: ansible-ansible.builtin.slurp Invoked with path=/var/lib/edpm-config/os-net-config.returncode src=/var/lib/edpm-config/os-net-config.returncode
Jan 23 09:56:34 compute-1 sudo[98100]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:34.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:35 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:35 compute-1 ceph-mon[80126]: pgmap v91: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:35 compute-1 sshd-session[95370]: Connection closed by 192.168.122.30 port 46764
Jan 23 09:56:35 compute-1 sshd-session[95367]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:56:35 compute-1 systemd[1]: session-39.scope: Deactivated successfully.
Jan 23 09:56:35 compute-1 systemd[1]: session-39.scope: Consumed 18.813s CPU time.
Jan 23 09:56:35 compute-1 systemd-logind[807]: Session 39 logged out. Waiting for processes to exit.
Jan 23 09:56:35 compute-1 systemd-logind[807]: Removed session 39.
Jan 23 09:56:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:56:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:36 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:56:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:36.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:36.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:37 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:37 compute-1 ceph-mon[80126]: pgmap v92: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:37 compute-1 sudo[98129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:56:37 compute-1 sudo[98129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:37 compute-1 sudo[98129]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:38 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:38.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:56:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:39.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:56:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:39 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:39 compute-1 ceph-mon[80126]: pgmap v93: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:56:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:39 : epoch 6973453e : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:56:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:40 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510003240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:56:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:40.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:56:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:41.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:41 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:41 compute-1 ceph-mon[80126]: pgmap v94: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:56:41 compute-1 sshd-session[98156]: Accepted publickey for zuul from 192.168.122.30 port 46394 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:56:41 compute-1 systemd-logind[807]: New session 40 of user zuul.
Jan 23 09:56:41 compute-1 systemd[1]: Started Session 40 of User zuul.
Jan 23 09:56:41 compute-1 sshd-session[98156]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:56:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa518003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:42 compute-1 python3.9[98310]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:42 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa5280032a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:42.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095642 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:56:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:43.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:43 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa510004340 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:56:43 compute-1 ceph-mon[80126]: pgmap v95: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:56:43 compute-1 python3.9[98464]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:44 compute-1 kernel: ganesha.nfsd[91121]: segfault at 50 ip 00007fa5c06ee32e sp 00007fa53f7fd210 error 4 in libntirpc.so.5.8[7fa5c06d3000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 23 09:56:44 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:56:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[86262]: 23/01/2026 09:56:44 : epoch 6973453e : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fa530004570 fd 42 proxy ignored for local
Jan 23 09:56:44 compute-1 systemd[1]: Started Process Core Dump (PID 98558/UID 0).
Jan 23 09:56:44 compute-1 ceph-mon[80126]: pgmap v96: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 09:56:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:44.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:44 compute-1 python3.9[98662]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:56:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:45.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:45 compute-1 sshd-session[98160]: Connection closed by 192.168.122.30 port 46394
Jan 23 09:56:45 compute-1 sshd-session[98156]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:56:45 compute-1 systemd[1]: session-40.scope: Deactivated successfully.
Jan 23 09:56:45 compute-1 systemd[1]: session-40.scope: Consumed 2.459s CPU time.
Jan 23 09:56:45 compute-1 systemd-logind[807]: Session 40 logged out. Waiting for processes to exit.
Jan 23 09:56:45 compute-1 systemd-logind[807]: Removed session 40.
Jan 23 09:56:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095645 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:56:45 compute-1 systemd-coredump[98566]: Process 86266 (ganesha.nfsd) of user 0 dumped core.
                                                   
                                                   Stack trace of thread 59:
                                                   #0  0x00007fa5c06ee32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                   ELF object binary architecture: AMD x86-64
Jan 23 09:56:45 compute-1 systemd[1]: systemd-coredump@1-98558-0.service: Deactivated successfully.
Jan 23 09:56:45 compute-1 systemd[1]: systemd-coredump@1-98558-0.service: Consumed 1.308s CPU time.
Jan 23 09:56:45 compute-1 podman[98694]: 2026-01-23 09:56:45.87651893 +0000 UTC m=+0.045238319 container died 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1)
Jan 23 09:56:45 compute-1 systemd[1]: var-lib-containers-storage-overlay-41d7b65a540631e866272710eff73efc2cc1c8799fe6e4ef514b17e880ffe73d-merged.mount: Deactivated successfully.
Jan 23 09:56:45 compute-1 systemd[82140]: Created slice User Background Tasks Slice.
Jan 23 09:56:45 compute-1 systemd[82140]: Starting Cleanup of User's Temporary Files and Directories...
Jan 23 09:56:45 compute-1 systemd[82140]: Finished Cleanup of User's Temporary Files and Directories.
Jan 23 09:56:46 compute-1 podman[98694]: 2026-01-23 09:56:46.053864249 +0000 UTC m=+0.222583608 container remove 130d8f7b0099e6cdc8025f8712dccd2b289451a574e89e02220980eeb9cd81fd (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:56:46 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:56:46 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 09:56:46 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.185s CPU time.
Jan 23 09:56:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:46.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:46 compute-1 ceph-mon[80126]: pgmap v97: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 09:56:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:47.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:48.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:49 compute-1 ceph-mon[80126]: pgmap v98: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 09:56:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:49.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095650 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:56:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:56:50 compute-1 sshd-session[98743]: Accepted publickey for zuul from 192.168.122.30 port 44562 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:56:50 compute-1 systemd-logind[807]: New session 41 of user zuul.
Jan 23 09:56:50 compute-1 systemd[1]: Started Session 41 of User zuul.
Jan 23 09:56:50 compute-1 sshd-session[98743]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:56:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:50.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:56:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:51.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:56:51 compute-1 ceph-mon[80126]: pgmap v99: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 09:56:51 compute-1 python3.9[98897]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:52 compute-1 python3.9[99051]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:56:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:56:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:52.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:56:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:53.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:53 compute-1 ceph-mon[80126]: pgmap v100: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 511 B/s wr, 2 op/s
Jan 23 09:56:53 compute-1 sudo[99206]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syaskxwdkqkkvravhzxzvaxrhfektora ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162213.1251583-76-35511429425402/AnsiballZ_setup.py'
Jan 23 09:56:53 compute-1 sudo[99206]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:53 compute-1 python3.9[99208]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:53 compute-1 sudo[99206]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:54 compute-1 sudo[99290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpgnrrbiskgwkesuwwjnrmcnneddyaro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162213.1251583-76-35511429425402/AnsiballZ_dnf.py'
Jan 23 09:56:54 compute-1 sudo[99290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:54 compute-1 python3.9[99292]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:56:54 compute-1 ceph-mon[80126]: pgmap v101: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:54.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:55.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:55 compute-1 sudo[99290]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:56 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 2.
Jan 23 09:56:56 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:56:56 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.185s CPU time.
Jan 23 09:56:56 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:56:56 compute-1 podman[99425]: 2026-01-23 09:56:56.528141399 +0000 UTC m=+0.025387734 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:56:56 compute-1 sudo[99503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsnegbuouoptqzeerxjxatbbslctuwfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162216.3928547-112-34985861000023/AnsiballZ_setup.py'
Jan 23 09:56:56 compute-1 sudo[99503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:56 compute-1 podman[99425]: 2026-01-23 09:56:56.732234008 +0000 UTC m=+0.229480323 container create 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:56:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:56.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:56 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:56:57 compute-1 podman[99425]: 2026-01-23 09:56:57.004292627 +0000 UTC m=+0.501538972 container init 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS)
Jan 23 09:56:57 compute-1 podman[99425]: 2026-01-23 09:56:57.010085234 +0000 UTC m=+0.507331549 container start 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 09:56:57 compute-1 python3.9[99505]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:56:57 compute-1 bash[99425]: 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743
Jan 23 09:56:57 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:56:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:56:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:56:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:56:57 compute-1 sudo[99589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:56:57 compute-1 sudo[99589]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:56:57 compute-1 sudo[99589]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:57 compute-1 sudo[99503]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:57 compute-1 ceph-mon[80126]: pgmap v102: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:56:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:56:58 compute-1 sudo[99768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-garzdshtjuqlrsrodatyflhvrojgchwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162217.866267-145-5923289704474/AnsiballZ_file.py'
Jan 23 09:56:58 compute-1 sudo[99768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:58 compute-1 python3.9[99770]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:56:58 compute-1 sudo[99768]: pam_unix(sudo:session): session closed for user root
Jan 23 09:56:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:56:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:56:58.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:56:58 compute-1 ceph-mon[80126]: pgmap v103: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 426 B/s wr, 1 op/s
Jan 23 09:56:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:56:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:56:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:56:59.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:56:59 compute-1 sudo[99921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-osvreowozdblffztuyjmxdufcjpwbdrr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162219.227133-169-58983514492347/AnsiballZ_command.py'
Jan 23 09:56:59 compute-1 sudo[99921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:56:59 compute-1 python3.9[99923]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:57:00 compute-1 sudo[99921]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:00 compute-1 sudo[100087]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxhpzmfucznrjkykazdyayozwhzbzopx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162220.184361-193-72460455855274/AnsiballZ_stat.py'
Jan 23 09:57:00 compute-1 sudo[100087]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:00 compute-1 python3.9[100089]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:57:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:00.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:57:00 compute-1 sudo[100087]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:01.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:01 compute-1 sudo[100165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rlqxebnqumsknxscmxtmjapoegalnysj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162220.184361-193-72460455855274/AnsiballZ_file.py'
Jan 23 09:57:01 compute-1 sudo[100165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:01 compute-1 python3.9[100167]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:01 compute-1 sudo[100165]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:01 compute-1 ceph-mon[80126]: pgmap v104: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Jan 23 09:57:02 compute-1 sudo[100318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syefpoifuipzzavxbwzkjntrfwayagke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162221.7835197-229-166759353299360/AnsiballZ_stat.py'
Jan 23 09:57:02 compute-1 sudo[100318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:02 compute-1 python3.9[100320]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:02 compute-1 sudo[100318]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:02 compute-1 sudo[100396]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pofjlvdazdrvnptbjrjhzutvggrbdrwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162221.7835197-229-166759353299360/AnsiballZ_file.py'
Jan 23 09:57:02 compute-1 sudo[100396]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:02 compute-1 python3.9[100398]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:02 compute-1 sudo[100396]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:57:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:02.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:57:02 compute-1 ceph-mon[80126]: pgmap v105: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 09:57:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:03.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:03 compute-1 sudo[100549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izfhwycsyuufjmyttwuooqozizoiysfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162222.9871864-268-87689602411752/AnsiballZ_ini_file.py'
Jan 23 09:57:03 compute-1 sudo[100549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:03 compute-1 python3.9[100551]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:03 compute-1 sudo[100549]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:04 compute-1 sudo[100701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djzfydsxahohwrhauydkqpmabiedltlz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162223.8148634-268-210717932382941/AnsiballZ_ini_file.py'
Jan 23 09:57:04 compute-1 sudo[100701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:04 compute-1 python3.9[100703]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:04 compute-1 sudo[100701]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_kv_traverse :CLIENT ID :EVENT :Failed to lst kv ret=-2
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_read_clids :CLIENT ID :EVENT :Failed to traverse recovery db: -2
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 09:57:04 compute-1 sudo[100853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-luccrqsejupsxskysmfmvkvsqubhidvh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162224.4321768-268-8542935402290/AnsiballZ_ini_file.py'
Jan 23 09:57:04 compute-1 sudo[100853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:57:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:04 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:57:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:04.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:04 compute-1 python3.9[100855]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:04 compute-1 sudo[100853]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:05.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:05 compute-1 ceph-mon[80126]: pgmap v106: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:57:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:06 compute-1 sudo[101006]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zagcrzoopsbdjsvcetgjnlydmkgbmxtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162225.073868-268-54265364110861/AnsiballZ_ini_file.py'
Jan 23 09:57:06 compute-1 sudo[101006]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:06 compute-1 ceph-mon[80126]: pgmap v107: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 09:57:06 compute-1 python3.9[101008]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:57:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:06.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:57:06 compute-1 sudo[101006]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095707 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:57:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:57:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:07.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:57:07 compute-1 sudo[101159]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oamsyyjfdeybaacainwkxeihgieugbgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162227.130305-361-244104347939589/AnsiballZ_dnf.py'
Jan 23 09:57:07 compute-1 sudo[101159]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:07 compute-1 python3.9[101161]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:57:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:08.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:09 compute-1 sudo[101159]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:09.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:09 compute-1 ceph-mon[80126]: pgmap v108: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.7 KiB/s rd, 1.5 KiB/s wr, 5 op/s
Jan 23 09:57:10 compute-1 sudo[101313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqsiucrncxxdmzulcdtigidrbnmvhxym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162229.894827-394-175605297795766/AnsiballZ_setup.py'
Jan 23 09:57:10 compute-1 sudo[101313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:10 compute-1 python3.9[101315]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:57:10 compute-1 sudo[101313]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] rados_cluster_end_grace :CLIENT ID :EVENT :Failed to remove rec-000000000000000a:nfs.cephfs.0: -2
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:57:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:10 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:57:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:57:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:10.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:57:11 compute-1 sudo[101479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ehrwduvkzrewtuzdnleobuhvwbrmrgvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162230.7099993-418-53248648654862/AnsiballZ_stat.py'
Jan 23 09:57:11 compute-1 sudo[101479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:11.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:11 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:11 compute-1 ceph-mon[80126]: pgmap v109: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.3 KiB/s rd, 1.2 KiB/s wr, 4 op/s
Jan 23 09:57:11 compute-1 python3.9[101481]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:57:11 compute-1 sudo[101479]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:11 compute-1 sudo[101636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ztuehlfldboceylrxejlfiumwkwskkgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162231.487862-445-203753455501966/AnsiballZ_stat.py'
Jan 23 09:57:11 compute-1 sudo[101636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:11 compute-1 python3.9[101638]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:57:11 compute-1 sudo[101636]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:12 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:12 compute-1 sudo[101788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-utnvxlxelzcqyhwneorvvxixxrgacjjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162232.3192623-475-88723169375725/AnsiballZ_command.py'
Jan 23 09:57:12 compute-1 sudo[101788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:12 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40016e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:12 compute-1 python3.9[101790]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:57:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:57:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:12.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:57:12 compute-1 sudo[101788]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:13.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:13 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:13 compute-1 ceph-mon[80126]: pgmap v110: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 3.9 KiB/s rd, 1.3 KiB/s wr, 5 op/s
Jan 23 09:57:13 compute-1 sudo[101942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uivswmswvmorpbpsqclppslqevkoxudd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162233.1867347-505-267612810791394/AnsiballZ_service_facts.py'
Jan 23 09:57:13 compute-1 sudo[101942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:13 compute-1 python3.9[101944]: ansible-service_facts Invoked
Jan 23 09:57:14 compute-1 network[101961]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:57:14 compute-1 network[101962]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:57:14 compute-1 network[101963]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:57:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095714 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:57:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:14 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:14 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:57:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:14.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:57:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:57:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:15.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:57:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:15 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:15 compute-1 ceph-mon[80126]: pgmap v111: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:57:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:16 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:16 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000034s ======
Jan 23 09:57:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:16.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000034s
Jan 23 09:57:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000035s ======
Jan 23 09:57:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:17.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000035s
Jan 23 09:57:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:17 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:17 compute-1 sudo[102058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:57:17 compute-1 sudo[102058]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:17 compute-1 sudo[102058]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:17 compute-1 ceph-mon[80126]: pgmap v112: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:57:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:17 compute-1 sudo[101942]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:18 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:18 compute-1 ceph-mon[80126]: pgmap v113: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:57:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:18 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:18.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:19.078 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:19 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:19 compute-1 sudo[102274]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjikyleiuyaghsuiuafpucfvxonyeaaw ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769162239.3177457-550-80289426177369/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769162239.3177457-550-80289426177369/args'
Jan 23 09:57:19 compute-1 sudo[102274]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:19 compute-1 sudo[102274]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:20 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:20 compute-1 sudo[102441]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opcpdluykusvmnnilqqfepfrhjnlmwwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162240.0964887-583-256086141825786/AnsiballZ_dnf.py'
Jan 23 09:57:20 compute-1 sudo[102441]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:20 compute-1 python3.9[102443]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 09:57:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:20 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:20.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:20 compute-1 ceph-mon[80126]: pgmap v114: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 170 B/s wr, 0 op/s
Jan 23 09:57:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:21.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:21 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:21 compute-1 sudo[102441]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:22 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:22 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:22 compute-1 sudo[102470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:57:22 compute-1 sudo[102470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:22 compute-1 sudo[102470]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:22 compute-1 sudo[102495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:57:22 compute-1 sudo[102495]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:22.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:23.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:23 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:23 compute-1 ceph-mon[80126]: pgmap v115: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 170 B/s wr, 1 op/s
Jan 23 09:57:23 compute-1 sudo[102495]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:23 compute-1 sudo[102678]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmfqqjmloignaihnmsksntdpbupykjgh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162242.8652024-622-74372234752196/AnsiballZ_package_facts.py'
Jan 23 09:57:23 compute-1 sudo[102678]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:23 compute-1 python3.9[102680]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Jan 23 09:57:24 compute-1 sudo[102678]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:57:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:57:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:57:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:57:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:57:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:24 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:24 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:24.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:25.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:25 compute-1 sudo[102830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzcrgnpcfirkjbfjktzsntghizfkiact ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162244.8215148-653-62420813808441/AnsiballZ_stat.py'
Jan 23 09:57:25 compute-1 sudo[102830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:25 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:25 compute-1 python3.9[102832]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:25 compute-1 sudo[102830]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:25 compute-1 ceph-mon[80126]: pgmap v116: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:25 compute-1 sudo[102909]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wiiokccvcgguikpaemwrpfeqhbkvzorc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162244.8215148-653-62420813808441/AnsiballZ_file.py'
Jan 23 09:57:25 compute-1 sudo[102909]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:25 compute-1 python3.9[102911]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:25 compute-1 sudo[102909]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:26 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:26 compute-1 sudo[103061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cyfnrsnzdcswwjiwxxlimerrombxkgwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162246.1246595-689-136202713661475/AnsiballZ_stat.py'
Jan 23 09:57:26 compute-1 sudo[103061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:26 compute-1 python3.9[103063]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:26 compute-1 sudo[103061]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:26 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:26 compute-1 sudo[103139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iwgcygqmkgppctsurugyvuhubvtvksii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162246.1246595-689-136202713661475/AnsiballZ_file.py'
Jan 23 09:57:26 compute-1 sudo[103139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:57:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:26.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:57:27 compute-1 python3.9[103141]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/chronyd _original_basename=chronyd.sysconfig.j2 recurse=False state=file path=/etc/sysconfig/chronyd force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:27 compute-1 sudo[103139]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:57:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:27.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:57:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:27 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:27 compute-1 ceph-mon[80126]: pgmap v117: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:28 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:28 compute-1 ceph-mon[80126]: pgmap v118: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:28 compute-1 sudo[103292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pgmtzerhxzrgoufhwfzubwsmcwsdbcgl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162248.190121-743-228576155602760/AnsiballZ_lineinfile.py'
Jan 23 09:57:28 compute-1 sudo[103292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:28 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:28 compute-1 python3.9[103294]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:28 compute-1 sudo[103292]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:57:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:28.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:57:29 compute-1 sudo[103319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:57:29 compute-1 sudo[103319]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:29 compute-1 sudo[103319]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:29.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:29 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:30 compute-1 sudo[103470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lyxpknglzqksrizwpygvjtdtgogzdcbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162249.9108286-787-101787685170813/AnsiballZ_setup.py'
Jan 23 09:57:30 compute-1 sudo[103470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:57:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:30 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:30 compute-1 python3.9[103472]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:57:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:30 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:30 compute-1 sudo[103470]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:30.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:31.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:31 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:31 compute-1 sudo[103554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enolzodwlclwpddevvxopxebamkokyfc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162249.9108286-787-101787685170813/AnsiballZ_systemd.py'
Jan 23 09:57:31 compute-1 sudo[103554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:31 compute-1 ceph-mon[80126]: pgmap v119: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:31 compute-1 python3.9[103556]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:57:31 compute-1 sudo[103554]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:32 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:32 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:32 compute-1 sshd-session[98746]: Connection closed by 192.168.122.30 port 44562
Jan 23 09:57:32 compute-1 sshd-session[98743]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:57:32 compute-1 systemd-logind[807]: Session 41 logged out. Waiting for processes to exit.
Jan 23 09:57:32 compute-1 systemd[1]: session-41.scope: Deactivated successfully.
Jan 23 09:57:32 compute-1 systemd[1]: session-41.scope: Consumed 24.394s CPU time.
Jan 23 09:57:32 compute-1 systemd-logind[807]: Removed session 41.
Jan 23 09:57:32 compute-1 ceph-mon[80126]: pgmap v120: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:57:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:32.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:33.101 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:33 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:34 compute-1 sshd-session[103586]: Invalid user sol from 45.148.10.240 port 54740
Jan 23 09:57:34 compute-1 sshd-session[103586]: Connection closed by invalid user sol 45.148.10.240 port 54740 [preauth]
Jan 23 09:57:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:34 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:34 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:34 compute-1 ceph-mon[80126]: pgmap v121: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:34.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:35.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:35 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:36 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:36 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:36 compute-1 ceph-mon[80126]: pgmap v122: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:36.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:37.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:37 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:37 compute-1 sudo[103590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:57:37 compute-1 sudo[103590]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:37 compute-1 sudo[103590]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:38 compute-1 sshd-session[103615]: Accepted publickey for zuul from 192.168.122.30 port 35976 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:57:38 compute-1 systemd-logind[807]: New session 42 of user zuul.
Jan 23 09:57:38 compute-1 systemd[1]: Started Session 42 of User zuul.
Jan 23 09:57:38 compute-1 sshd-session[103615]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:57:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:38 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:38 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:38 compute-1 sudo[103768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnnkuvieapttdjdkbxqxxcyrqolwmbpm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162258.282903-22-60417266295891/AnsiballZ_file.py'
Jan 23 09:57:38 compute-1 sudo[103768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:38.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:38 compute-1 ceph-mon[80126]: pgmap v123: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:39 compute-1 python3.9[103770]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:39 compute-1 sudo[103768]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:39.114 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:39 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:39 compute-1 sudo[103921]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gptqtavjnxyywtuxijahtbrlyuxsutor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162259.238858-58-245958543590571/AnsiballZ_stat.py'
Jan 23 09:57:39 compute-1 sudo[103921]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:39 compute-1 python3.9[103923]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/ceph-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:39 compute-1 sudo[103921]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:40 compute-1 sudo[103999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnssqnrdlxuqhybwcjztvrxhgxoscxbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162259.238858-58-245958543590571/AnsiballZ_file.py'
Jan 23 09:57:40 compute-1 sudo[103999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:40 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:40 compute-1 python3.9[104001]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/ceph-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/ceph-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:40 compute-1 sudo[103999]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:40 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd0004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:40 compute-1 sshd-session[103618]: Connection closed by 192.168.122.30 port 35976
Jan 23 09:57:40 compute-1 sshd-session[103615]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:57:40 compute-1 systemd[1]: session-42.scope: Deactivated successfully.
Jan 23 09:57:40 compute-1 systemd[1]: session-42.scope: Consumed 1.639s CPU time.
Jan 23 09:57:40 compute-1 systemd-logind[807]: Session 42 logged out. Waiting for processes to exit.
Jan 23 09:57:40 compute-1 systemd-logind[807]: Removed session 42.
Jan 23 09:57:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:40.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:41 compute-1 ceph-mon[80126]: pgmap v124: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:41.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:41 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcd4003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:42 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:42 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:42.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:43.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:43 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:44 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf80013a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:44 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:44.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:45.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:45 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:45 compute-1 ceph-mon[80126]: pgmap v125: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:57:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:46 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:46 compute-1 ceph-mon[80126]: pgmap v126: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:46 compute-1 ceph-mon[80126]: pgmap v127: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:46 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8001eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:46.930 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:47.130 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:47 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:47 compute-1 sshd-session[104033]: Accepted publickey for zuul from 192.168.122.30 port 43192 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:57:47 compute-1 systemd-logind[807]: New session 43 of user zuul.
Jan 23 09:57:47 compute-1 systemd[1]: Started Session 43 of User zuul.
Jan 23 09:57:47 compute-1 sshd-session[104033]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:57:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:48 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:48 compute-1 python3.9[104186]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:57:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:48 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:48.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:49.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:49 compute-1 ceph-mon[80126]: pgmap v128: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:49 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8001eb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:49 compute-1 sudo[104341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huzgtzhjudmqrgijlhjtcsaaqjkjnblw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162269.1736126-55-268720563378778/AnsiballZ_file.py'
Jan 23 09:57:49 compute-1 sudo[104341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:49 compute-1 python3.9[104343]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:49 compute-1 sudo[104341]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:57:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:50 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:50 compute-1 sudo[104516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fsebnorhvusrhqpsvlsptofxrlspwjag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162270.1022658-79-243370317010327/AnsiballZ_stat.py'
Jan 23 09:57:50 compute-1 sudo[104516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:50 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:50 compute-1 python3.9[104518]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:50 compute-1 sudo[104516]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:50.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:51 compute-1 sudo[104594]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qpzjmadsapwneawkgjthfrcccxezsdmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162270.1022658-79-243370317010327/AnsiballZ_file.py'
Jan 23 09:57:51 compute-1 sudo[104594]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:51.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:51 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf4002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:51 compute-1 python3.9[104596]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.u1_6i0p7 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:51 compute-1 sudo[104594]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:51 compute-1 ceph-mon[80126]: pgmap v129: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.116387) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272116621, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1422, "num_deletes": 252, "total_data_size": 4154785, "memory_usage": 4203160, "flush_reason": "Manual Compaction"}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272132891, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1767345, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10469, "largest_seqno": 11886, "table_properties": {"data_size": 1762718, "index_size": 2087, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11771, "raw_average_key_size": 20, "raw_value_size": 1752658, "raw_average_value_size": 2980, "num_data_blocks": 94, "num_entries": 588, "num_filter_entries": 588, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162146, "oldest_key_time": 1769162146, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 16686 microseconds, and 7444 cpu microseconds.
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.133137) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1767345 bytes OK
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.133212) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.134600) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.134625) EVENT_LOG_v1 {"time_micros": 1769162272134619, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.134653) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 4148107, prev total WAL file size 4148107, number of live WAL files 2.
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.136272) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740030' seq:72057594037927935, type:22 .. '6D67727374617400323533' seq:0, type:0; will stop at (end)
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1725KB)], [21(13MB)]
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272136402, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 15549399, "oldest_snapshot_seqno": -1}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 4245 keys, 13455552 bytes, temperature: kUnknown
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272225164, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 13455552, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13423230, "index_size": 20628, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10629, "raw_key_size": 107951, "raw_average_key_size": 25, "raw_value_size": 13341601, "raw_average_value_size": 3142, "num_data_blocks": 884, "num_entries": 4245, "num_filter_entries": 4245, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162272, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.225570) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 13455552 bytes
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.227157) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.0 rd, 151.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 13.1 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(16.4) write-amplify(7.6) OK, records in: 4709, records dropped: 464 output_compression: NoCompression
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.227194) EVENT_LOG_v1 {"time_micros": 1769162272227178, "job": 10, "event": "compaction_finished", "compaction_time_micros": 88841, "compaction_time_cpu_micros": 36567, "output_level": 6, "num_output_files": 1, "total_output_size": 13455552, "num_input_records": 4709, "num_output_records": 4245, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272227960, "job": 10, "event": "table_file_deletion", "file_number": 23}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162272232776, "job": 10, "event": "table_file_deletion", "file_number": 21}
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.136120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232874) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232882) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232885) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232888) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:57:52.232891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:57:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:52 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:52 compute-1 sudo[104747]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hcpyubcorsblaxcotwmxnbwdwpktwpxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162272.072949-139-273292765251690/AnsiballZ_stat.py'
Jan 23 09:57:52 compute-1 sudo[104747]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:52 compute-1 python3.9[104749]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:52 compute-1 sudo[104747]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:52 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:52 compute-1 sudo[104825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvsgnvldhvanihmdxgyfhjrtmebhpdhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162272.072949-139-273292765251690/AnsiballZ_file.py'
Jan 23 09:57:52 compute-1 sudo[104825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:52.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:53 compute-1 python3.9[104827]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.js2dmu92 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:53 compute-1 sudo[104825]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:53.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:53 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:53 compute-1 ceph-mon[80126]: pgmap v130: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:57:53 compute-1 sudo[104978]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tthsetxciglsdkwyhejwewhfciwginrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162273.3257084-178-221195414940422/AnsiballZ_file.py'
Jan 23 09:57:53 compute-1 sudo[104978]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:53 compute-1 python3.9[104980]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:53 compute-1 sudo[104978]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:54 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:54 compute-1 sudo[105130]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrxjmcycqgmnfzzmvgkvdjmzrxkcijal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162274.0628874-202-102977131623083/AnsiballZ_stat.py'
Jan 23 09:57:54 compute-1 sudo[105130]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:54 compute-1 python3.9[105132]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:54 compute-1 sudo[105130]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:54 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf8002bc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:54 compute-1 sudo[105208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oydgdsnlrqqfomlcdaqvttjdyqqhefyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162274.0628874-202-102977131623083/AnsiballZ_file.py'
Jan 23 09:57:54 compute-1 sudo[105208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:54.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:55 compute-1 python3.9[105210]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:55 compute-1 sudo[105208]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:55.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:55 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:55 compute-1 sudo[105361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emybugwhszutggnrwopoidqrlxbndcas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162275.187459-202-67448391788358/AnsiballZ_stat.py'
Jan 23 09:57:55 compute-1 sudo[105361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:55 compute-1 ceph-mon[80126]: pgmap v131: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:55 compute-1 python3.9[105363]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:55 compute-1 sudo[105361]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:55 compute-1 sudo[105439]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dzxhauyqbgkwyphyomklexmkswpdzuqo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162275.187459-202-67448391788358/AnsiballZ_file.py'
Jan 23 09:57:55 compute-1 sudo[105439]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:56 compute-1 python3.9[105441]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:57:56 compute-1 sudo[105439]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:56 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:56 compute-1 sudo[105591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aavucmhsabxopcgltrpuqlgviqwtzbic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162276.4059887-271-180385268809191/AnsiballZ_file.py'
Jan 23 09:57:56 compute-1 sudo[105591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:56 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf40091b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:56 compute-1 python3.9[105593]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:56 compute-1 sudo[105591]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:56.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:57 compute-1 ceph-mon[80126]: pgmap v132: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:57:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:57.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:57 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:57 compute-1 sudo[105744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eydqedwgkcbqmstexjjxamohzrhukzwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162277.0816414-295-127896297480487/AnsiballZ_stat.py'
Jan 23 09:57:57 compute-1 sudo[105744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:57 compute-1 python3.9[105746]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:57 compute-1 sudo[105747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:57:57 compute-1 sudo[105747]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:57:57 compute-1 sudo[105744]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:57 compute-1 sudo[105747]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:57:57 compute-1 sudo[105847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yrtmbiileqjjcssvpcgjetdygbucigsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162277.0816414-295-127896297480487/AnsiballZ_file.py'
Jan 23 09:57:57 compute-1 sudo[105847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:57 compute-1 python3.9[105849]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:57 compute-1 sudo[105847]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:58 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:58 compute-1 sudo[105999]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anrqhjgompovwfvfaxbflyuxpzzajbmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162278.2289014-331-108692086531926/AnsiballZ_stat.py'
Jan 23 09:57:58 compute-1 sudo[105999]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:58 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:58 compute-1 python3.9[106001]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:57:58 compute-1 sudo[105999]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:57:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:57:58.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:57:58 compute-1 sudo[106077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rzlyqddirtbgkmzrrahoafrcmxumcjkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162278.2289014-331-108692086531926/AnsiballZ_file.py'
Jan 23 09:57:58 compute-1 sudo[106077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:57:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:57:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:57:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:57:59.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:57:59 compute-1 python3.9[106079]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:57:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:57:59 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:57:59 compute-1 sudo[106077]: pam_unix(sudo:session): session closed for user root
Jan 23 09:57:59 compute-1 ceph-mon[80126]: pgmap v133: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:00 compute-1 sudo[106230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzwroxsyzlyurjvukxkjlbadcuikwaph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162279.4008305-367-219905249346743/AnsiballZ_systemd.py'
Jan 23 09:58:00 compute-1 sudo[106230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:00 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcf80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:00 compute-1 python3.9[106232]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:58:00 compute-1 systemd[1]: Reloading.
Jan 23 09:58:00 compute-1 systemd-sysv-generator[106263]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:58:00 compute-1 systemd-rc-local-generator[106258]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:58:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:00 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efce40023f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:00 compute-1 sudo[106230]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:00.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:01.151 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:01 compute-1 sudo[106419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owuklprvclvbbjekqnzbgeibtckyjytk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162280.918367-391-36568973666469/AnsiballZ_stat.py'
Jan 23 09:58:01 compute-1 sudo[106419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:01 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:01 compute-1 ceph-mon[80126]: pgmap v134: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:01 compute-1 python3.9[106421]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:01 compute-1 sudo[106419]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:01 compute-1 sudo[106498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlmennnonobyozkzkuforhvvggojnsuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162280.918367-391-36568973666469/AnsiballZ_file.py'
Jan 23 09:58:01 compute-1 sudo[106498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:01 compute-1 python3.9[106500]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:01 compute-1 sudo[106498]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:02 compute-1 kernel: ganesha.nfsd[104028]: segfault at 50 ip 00007efd7eb8332e sp 00007efce37fd210 error 4 in libntirpc.so.5.8[7efd7eb68000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 23 09:58:02 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:58:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[99508]: 23/01/2026 09:58:02 : epoch 697345e9 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7efcc8003c10 fd 38 proxy ignored for local
Jan 23 09:58:02 compute-1 systemd[1]: Started Process Core Dump (PID 106651/UID 0).
Jan 23 09:58:02 compute-1 sudo[106650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-valyemtsuziodpvwjrhbazqyxxteftbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162282.0624897-427-137192499280337/AnsiballZ_stat.py'
Jan 23 09:58:02 compute-1 sudo[106650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:02 compute-1 python3.9[106654]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:02 compute-1 sudo[106650]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:02 compute-1 sudo[106730]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-srlvsbyjepxrawbpvijbtbiuowbptffq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162282.0624897-427-137192499280337/AnsiballZ_file.py'
Jan 23 09:58:02 compute-1 sudo[106730]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:02.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:03 compute-1 python3.9[106732]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:03 compute-1 sudo[106730]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:03.154 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:03 compute-1 ceph-mon[80126]: pgmap v135: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 09:58:03 compute-1 sudo[106883]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sgcmjsehzmqbpjgdxkowubaorhdzhrzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162283.2040994-463-38592472371555/AnsiballZ_systemd.py'
Jan 23 09:58:03 compute-1 sudo[106883]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:03 compute-1 systemd-coredump[106653]: Process 99512 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 56:
                                                    #0  0x00007efd7eb8332e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007efd7eb8d900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 09:58:03 compute-1 systemd[1]: systemd-coredump@2-106651-0.service: Deactivated successfully.
Jan 23 09:58:03 compute-1 systemd[1]: systemd-coredump@2-106651-0.service: Consumed 1.293s CPU time.
Jan 23 09:58:03 compute-1 podman[106890]: 2026-01-23 09:58:03.736439387 +0000 UTC m=+0.028238564 container died 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True)
Jan 23 09:58:03 compute-1 python3.9[106885]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 09:58:03 compute-1 systemd[1]: Reloading.
Jan 23 09:58:03 compute-1 systemd-rc-local-generator[106925]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 09:58:03 compute-1 systemd-sysv-generator[106928]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 09:58:04 compute-1 systemd[1]: Starting Create netns directory...
Jan 23 09:58:04 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 09:58:04 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 09:58:04 compute-1 systemd[1]: Finished Create netns directory.
Jan 23 09:58:04 compute-1 sudo[106883]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:04 compute-1 systemd[1]: var-lib-containers-storage-overlay-be49650d80c1371f341a926e9e26d7b9b5cdb8123132173cb716ef3c860b069b-merged.mount: Deactivated successfully.
Jan 23 09:58:04 compute-1 ceph-mon[80126]: pgmap v136: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:04 compute-1 podman[106890]: 2026-01-23 09:58:04.752726933 +0000 UTC m=+1.044526080 container remove 32a2281e82fb75e76d63fdb5751754b253a28dce59c2dae2649a9b567c16d743 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 09:58:04 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:58:04 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 09:58:04 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.644s CPU time.
Jan 23 09:58:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:58:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:04.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:58:05 compute-1 python3.9[107120]: ansible-ansible.builtin.service_facts Invoked
Jan 23 09:58:05 compute-1 network[107137]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 09:58:05 compute-1 network[107138]: 'network-scripts' will be removed from distribution in near future.
Jan 23 09:58:05 compute-1 network[107139]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 09:58:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:05.157 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:06.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:07 compute-1 ceph-mon[80126]: pgmap v137: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:07.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095808 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:58:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:08.965 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:09.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:09 compute-1 sudo[107401]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lozommerervmdxvfnovxybrkjildkftv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162288.8929632-541-277433114650977/AnsiballZ_stat.py'
Jan 23 09:58:09 compute-1 sudo[107401]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:09 compute-1 ceph-mon[80126]: pgmap v138: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:09 compute-1 python3.9[107403]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:09 compute-1 sudo[107401]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:09 compute-1 sudo[107480]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-duccjrcobzcmmnwgogknclafloxbbubn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162288.8929632-541-277433114650977/AnsiballZ_file.py'
Jan 23 09:58:09 compute-1 sudo[107480]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:09 compute-1 python3.9[107482]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:09 compute-1 sudo[107480]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:10 compute-1 sudo[107632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xnblxmtnojlouwntnsvlfgwbpgtnqoon ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162290.2450063-580-104257509936421/AnsiballZ_file.py'
Jan 23 09:58:10 compute-1 sudo[107632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:10 compute-1 python3.9[107634]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:10 compute-1 sudo[107632]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:10.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:11.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:11 compute-1 sudo[107784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nlfjfqftvfcjgvhcqvendizoauisjxfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162290.9400215-604-204644521764970/AnsiballZ_stat.py'
Jan 23 09:58:11 compute-1 sudo[107784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:11 compute-1 ceph-mon[80126]: pgmap v139: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:11 compute-1 python3.9[107786]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:11 compute-1 sudo[107784]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:11 compute-1 sudo[107863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qzbibzyuuhlhwhiqcqakhcqjkvigdemu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162290.9400215-604-204644521764970/AnsiballZ_file.py'
Jan 23 09:58:11 compute-1 sudo[107863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:11 compute-1 python3.9[107865]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/var/lib/edpm-config/firewall/sshd-networks.yaml _original_basename=firewall.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/sshd-networks.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:11 compute-1 sudo[107863]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:12.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:13 compute-1 sudo[108015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bhegphebxdrhyzpiqvhcmodmlnglofmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162292.6334174-649-46359239917334/AnsiballZ_timezone.py'
Jan 23 09:58:13 compute-1 sudo[108015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:58:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:13.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:58:13 compute-1 python3.9[108017]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Jan 23 09:58:13 compute-1 systemd[1]: Starting Time & Date Service...
Jan 23 09:58:13 compute-1 systemd[1]: Started Time & Date Service.
Jan 23 09:58:13 compute-1 sudo[108015]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:13 compute-1 ceph-mon[80126]: pgmap v140: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:58:14 compute-1 sudo[108172]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruzvbbthdkgrytkyrothykqevrogbkan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162293.7590604-676-100267218145697/AnsiballZ_file.py'
Jan 23 09:58:14 compute-1 sudo[108172]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:14 compute-1 python3.9[108174]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:14 compute-1 sudo[108172]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:14 compute-1 ceph-mon[80126]: pgmap v141: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:58:14 compute-1 sudo[108325]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsnramvdzbtcunououknmjmgfywveian ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162294.43469-700-77709411666515/AnsiballZ_stat.py'
Jan 23 09:58:14 compute-1 sudo[108325]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:14 compute-1 python3.9[108327]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:14 compute-1 sudo[108325]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:14.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:15 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 3.
Jan 23 09:58:15 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:58:15 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.644s CPU time.
Jan 23 09:58:15 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 09:58:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:15.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:15 compute-1 sudo[108422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qykhlibjhbbmvxcnfhanzsfotmxtjglv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162294.43469-700-77709411666515/AnsiballZ_file.py'
Jan 23 09:58:15 compute-1 sudo[108422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:15 compute-1 podman[108450]: 2026-01-23 09:58:15.340188416 +0000 UTC m=+0.075712184 container create a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 09:58:15 compute-1 python3.9[108430]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 09:58:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 09:58:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:58:15 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 09:58:15 compute-1 podman[108450]: 2026-01-23 09:58:15.310411347 +0000 UTC m=+0.045935175 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 09:58:15 compute-1 podman[108450]: 2026-01-23 09:58:15.403210274 +0000 UTC m=+0.138734022 container init a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 09:58:15 compute-1 podman[108450]: 2026-01-23 09:58:15.408261463 +0000 UTC m=+0.143785191 container start a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_REF=squid)
Jan 23 09:58:15 compute-1 bash[108450]: a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 09:58:15 compute-1 sudo[108422]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:15 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 09:58:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:58:15 compute-1 sudo[108657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pwzsxfgjhwfxnzbqixwqscmxgendwdru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162295.6039968-736-277504659997691/AnsiballZ_stat.py'
Jan 23 09:58:15 compute-1 sudo[108657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:16 compute-1 python3.9[108659]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:16 compute-1 sudo[108657]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:16 compute-1 sudo[108735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oipmnushpzbfarehznuostylnevydnrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162295.6039968-736-277504659997691/AnsiballZ_file.py'
Jan 23 09:58:16 compute-1 sudo[108735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:16 compute-1 python3.9[108737]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.t5sc_d7x recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:16 compute-1 sudo[108735]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:16.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:17.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:17 compute-1 sudo[108887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlmvpzesmrsdygigjjrulvpbmyyypygt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162296.9359307-772-152587807262260/AnsiballZ_stat.py'
Jan 23 09:58:17 compute-1 sudo[108887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:17 compute-1 python3.9[108889]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:17 compute-1 sudo[108887]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:17 compute-1 ceph-mon[80126]: pgmap v142: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:58:17 compute-1 sudo[108923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:58:17 compute-1 sudo[108923]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:17 compute-1 sudo[108923]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:17 compute-1 sudo[108991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jlsilgdyihzfhrhskdxxcaoghfrssarg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162296.9359307-772-152587807262260/AnsiballZ_file.py'
Jan 23 09:58:17 compute-1 sudo[108991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:17 compute-1 python3.9[108993]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:17 compute-1 sudo[108991]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:18 compute-1 sudo[109143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhcsuwownwpfozpixycxorjvjnadxumb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162298.1568294-811-257839611356845/AnsiballZ_command.py'
Jan 23 09:58:18 compute-1 sudo[109143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:18 compute-1 python3.9[109145]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:18 compute-1 sudo[109143]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:18.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:19.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:19 compute-1 ceph-mon[80126]: pgmap v143: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:58:19 compute-1 sudo[109297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vyecoymyttuucuoxsywuzngwbcmgmcnz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162299.1134796-835-69413866320452/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 09:58:19 compute-1 sudo[109297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:19 compute-1 python3[109299]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 09:58:19 compute-1 sudo[109297]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:20 compute-1 sudo[109449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gyrhhgtsfyftfodyutysusmvriwnubck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162300.0503063-859-18910983467708/AnsiballZ_stat.py'
Jan 23 09:58:20 compute-1 sudo[109449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:20 compute-1 python3.9[109451]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:20 compute-1 sudo[109449]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:20 compute-1 sudo[109527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bzcshyhfnmcjhxyzvcjpcvfdcwnhyaqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162300.0503063-859-18910983467708/AnsiballZ_file.py'
Jan 23 09:58:20 compute-1 sudo[109527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:20.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:21 compute-1 python3.9[109529]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:21 compute-1 sudo[109527]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:21.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:58:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:58:22 compute-1 sudo[109680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-otomxrqbanbysfmlwdcumjozmbduhjmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162301.437065-895-253409366867912/AnsiballZ_stat.py'
Jan 23 09:58:22 compute-1 sudo[109680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:22 compute-1 ceph-mon[80126]: pgmap v144: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:58:22 compute-1 python3.9[109682]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:22 compute-1 sudo[109680]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:22 compute-1 sudo[109805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evsqfzqznqtvkdxoxbhuuyuvctpusvxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162301.437065-895-253409366867912/AnsiballZ_copy.py'
Jan 23 09:58:22 compute-1 sudo[109805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:22 compute-1 python3.9[109807]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162301.437065-895-253409366867912/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:22 compute-1 sudo[109805]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:22.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:23.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:23 compute-1 sudo[109958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ahwgamlbajifftllvexgthlqleimvouz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162303.078882-940-105308113596816/AnsiballZ_stat.py'
Jan 23 09:58:23 compute-1 sudo[109958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:23 compute-1 python3.9[109960]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:23 compute-1 ceph-mon[80126]: pgmap v145: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:58:23 compute-1 sudo[109958]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:23 compute-1 sudo[110036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqdpqmkkaplmgzjgkqbmsoycpnolcxmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162303.078882-940-105308113596816/AnsiballZ_file.py'
Jan 23 09:58:23 compute-1 sudo[110036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:24 compute-1 python3.9[110038]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:24 compute-1 sudo[110036]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:24 compute-1 sudo[110188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pmtdtnhwuegbbazipsdpqpglaeoziwos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162304.4868226-976-85808491334497/AnsiballZ_stat.py'
Jan 23 09:58:24 compute-1 sudo[110188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:24.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:24 compute-1 python3.9[110190]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:25 compute-1 sudo[110188]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:25 compute-1 ceph-mon[80126]: pgmap v146: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:58:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:25.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:25 compute-1 sudo[110266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruafeshikclulrndoizycapawbiedoaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162304.4868226-976-85808491334497/AnsiballZ_file.py'
Jan 23 09:58:25 compute-1 sudo[110266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:25 compute-1 python3.9[110268]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:25 compute-1 sudo[110266]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:26 compute-1 sudo[110419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xuprfqwyqhhwzmxvbjfvvhrkcdrvpcup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162305.7080588-1013-280902938132709/AnsiballZ_stat.py'
Jan 23 09:58:26 compute-1 sudo[110419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:26 compute-1 python3.9[110421]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:26 compute-1 sudo[110419]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:26 compute-1 sudo[110497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dqmkcyqldkxucefevjhvrovtqhqhjjgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162305.7080588-1013-280902938132709/AnsiballZ_file.py'
Jan 23 09:58:26 compute-1 sudo[110497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:26 compute-1 python3.9[110499]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-rules.nft _original_basename=ruleset.j2 recurse=False state=file path=/etc/nftables/edpm-rules.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:26 compute-1 sudo[110497]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:26.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:27.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:27 compute-1 sudo[110649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sthhoqcopdowrnzcthtnsspkrbqlyvtc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162307.018462-1051-65262945188493/AnsiballZ_command.py'
Jan 23 09:58:27 compute-1 sudo[110649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:27 compute-1 ceph-mon[80126]: pgmap v147: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 09:58:27 compute-1 python3.9[110651]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:27 compute-1 sudo[110649]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:28 compute-1 sudo[110805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzumazvxptxppmpwegubesyhtpllicne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162307.7214286-1075-145385599907333/AnsiballZ_blockinfile.py'
Jan 23 09:58:28 compute-1 sudo[110805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:28 compute-1 python3.9[110807]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:58:28 compute-1 sudo[110805]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 09:58:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:28 compute-1 sudo[110973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eysnilikituocchrezxkbgwkhoywbgyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162308.676054-1102-120742399812914/AnsiballZ_file.py'
Jan 23 09:58:28 compute-1 sudo[110973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:28.997 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:29 compute-1 python3.9[110975]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:29 compute-1 sudo[110973]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:29.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:29 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8001970 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:29 compute-1 sudo[110976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:58:29 compute-1 sudo[110976]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:29 compute-1 sudo[110976]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:29 compute-1 sudo[111022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:58:29 compute-1 sudo[111022]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:29 compute-1 sudo[111191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myecxqqtteoxltwvriweynbzogqfgqbj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162309.3274343-1102-32975658511475/AnsiballZ_file.py'
Jan 23 09:58:29 compute-1 sudo[111191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:29 compute-1 ceph-mon[80126]: pgmap v148: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1020 B/s wr, 3 op/s
Jan 23 09:58:29 compute-1 python3.9[111193]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:29 compute-1 sudo[111022]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:29 compute-1 sudo[111191]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:30 compute-1 sudo[111360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssrrvetuoygrjuavcmraugarufsbkrgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162310.0312862-1147-179690312731860/AnsiballZ_mount.py'
Jan 23 09:58:30 compute-1 sudo[111360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:30 compute-1 python3.9[111362]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:58:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:30 compute-1 sudo[111360]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:30.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:31 compute-1 ceph-mon[80126]: pgmap v149: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 935 B/s wr, 2 op/s
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:58:31 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:58:31 compute-1 sudo[111512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kgpavbmutirxruoeharxbumyncdtxyvd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162310.8713346-1147-257910836421027/AnsiballZ_mount.py'
Jan 23 09:58:31 compute-1 sudo[111512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:58:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:31.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:58:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:31 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:31 compute-1 python3.9[111514]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Jan 23 09:58:31 compute-1 sudo[111512]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:31 compute-1 sshd-session[104036]: Connection closed by 192.168.122.30 port 43192
Jan 23 09:58:31 compute-1 sshd-session[104033]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:58:31 compute-1 systemd[1]: session-43.scope: Deactivated successfully.
Jan 23 09:58:31 compute-1 systemd[1]: session-43.scope: Consumed 30.505s CPU time.
Jan 23 09:58:31 compute-1 systemd-logind[807]: Session 43 logged out. Waiting for processes to exit.
Jan 23 09:58:31 compute-1 systemd-logind[807]: Removed session 43.
Jan 23 09:58:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095832 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:58:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:33.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:33.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:33 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:33 compute-1 ceph-mon[80126]: pgmap v150: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 935 B/s wr, 2 op/s
Jan 23 09:58:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:35.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:35.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:35 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:35 compute-1 ceph-mon[80126]: pgmap v151: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 09:58:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:36 compute-1 sudo[111542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:58:36 compute-1 sudo[111542]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:36 compute-1 sudo[111542]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:36 compute-1 ceph-mon[80126]: pgmap v152: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 09:58:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:58:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:37.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:37.215 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:37 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:37 compute-1 sshd-session[111568]: Accepted publickey for zuul from 192.168.122.30 port 43858 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:58:37 compute-1 systemd-logind[807]: New session 44 of user zuul.
Jan 23 09:58:37 compute-1 systemd[1]: Started Session 44 of User zuul.
Jan 23 09:58:37 compute-1 sshd-session[111568]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:58:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:37 compute-1 sudo[111572]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:58:37 compute-1 sudo[111572]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:37 compute-1 sudo[111572]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:38 compute-1 sudo[111746]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hatdrneaipgblztqfpxvhwukulgwszve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162317.7181695-19-265437139797220/AnsiballZ_tempfile.py'
Jan 23 09:58:38 compute-1 sudo[111746]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:38 compute-1 python3.9[111748]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Jan 23 09:58:38 compute-1 sudo[111746]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00021f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:39.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:39 compute-1 sudo[111898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qhflbnnglhovhoqabtuujwbmasmisdfi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162318.6137588-55-242058903759252/AnsiballZ_stat.py'
Jan 23 09:58:39 compute-1 sudo[111898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:39.218 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:39 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc001d50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:39 compute-1 python3.9[111900]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:58:39 compute-1 ceph-mon[80126]: pgmap v153: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 09:58:39 compute-1 sudo[111898]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:39 compute-1 sudo[112053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-inoxqcgyvawqzawvsgezqxxxmsziqimo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162319.4885173-79-103116354807812/AnsiballZ_slurp.py'
Jan 23 09:58:39 compute-1 sudo[112053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:40 compute-1 python3.9[112055]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Jan 23 09:58:40 compute-1 sudo[112053]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:40 compute-1 sudo[112205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzwwwhayhpguhcrukssqohpynwppmuyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162320.3643203-103-1280156936906/AnsiballZ_stat.py'
Jan 23 09:58:40 compute-1 sudo[112205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:40 compute-1 python3.9[112207]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible.fvcogk_j follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:58:40 compute-1 sudo[112205]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:58:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:41.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:58:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:41 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:41.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:41 compute-1 sudo[112331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yemilapywzpwvfzsxgtmdsabpjtsrsnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162320.3643203-103-1280156936906/AnsiballZ_copy.py'
Jan 23 09:58:41 compute-1 sudo[112331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:41 compute-1 ceph-mon[80126]: pgmap v154: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:58:41 compute-1 python3.9[112333]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible.fvcogk_j mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162320.3643203-103-1280156936906/.source.fvcogk_j _original_basename=.1kcbc928 follow=False checksum=6c63675b4fda7e0d01c328fcbe34dc890491aeeb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:41 compute-1 sudo[112331]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:42 compute-1 sudo[112483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-geipmmvybgyogtrkkkyrttxeqvpbjnnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162321.8814518-148-81270540054412/AnsiballZ_setup.py'
Jan 23 09:58:42 compute-1 sudo[112483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:42 compute-1 ceph-mon[80126]: pgmap v155: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:58:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:42 compute-1 python3.9[112485]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:58:42 compute-1 sudo[112483]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:43.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:43 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:43.226 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:43 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 09:58:43 compute-1 sudo[112639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oyntpcvlsoeszocqhrkqwvomqrkvvmbu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162323.1574054-173-49388851796709/AnsiballZ_blockinfile.py'
Jan 23 09:58:43 compute-1 sudo[112639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:44 compute-1 python3.9[112641]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC+cj2so8SS29oYZ1K+7e02qi6fVkGXJzGMkIN9mgJPLCBtQ6vpBYEObTZZXuMIHhdiMUAp6RDjs11OXDkAB9R7e2ncjMKn7J2EHbmceT7rNq9L0w+QaLKFxl+xdJQ9QtO9ioNgJFXXQZt/IOeE8S4I5yhEM5jn+YEW0LPbp99Wz1d1Ob4GI1t0hCEv/4ayC3nRIXkuIhl7mrV0s22F8NE8f0hZZKaw1u8xmmpbD8ZVBsC6cxWE3kIQBmHu8q9tylaZjLsjGxBDUF9ko3bxeppvLPDMem89VLQCWbgmOHl5ZIPsyNglusTIBUp8uA7g+Agz1uMojClMHnsZl68WjbCAVcRA9y/UgXphGyEYZCUJMv8CjYKzxriyHALZl6YFSyC5ELlEAxL8fyTwtXhQ1+e/lI9Ak3n4suC6JyH0NQ27MPIf7riyUFJLw9lZaDerZOkvI7/Y2PfRvdfyZ57g/xgGeLY0Ch30SFVC04lNXIpsOWbLBOg0BMP9ZiciAYAF9Yc=
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIIreWuVcekgp7kF5pU+4TIKLHZyhuqd4Ly312ExEA5EG
                                             compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJWfXOTsTXqDhdGhW7VcUXsYqCS7TzCPyaa9/dA9e0xKjnni1/GRM8FdYXWYbGsNnBQFWk3/pXD6sj3jKzK34AM=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDA/6JnQZ3CFC7xgv4DrvdZizVbVnsolKcWkvqzGu1hFHGmOEb7ehbxGPHBnp2N9iRf13H12EI0qNI6A2f44V0oXE3SP+fpJ6PVYQRQpKqTEiweqZaHEyYE2FnKy0HDQisg5hwr1egYLjGXChdkyqWSokL1LqaCyD2+EcOzUvC/GuVQ7eQnQBIGBpYAnNzS/64KKOZ0+0soOPJGxVCma6JN/2GcCunX6j3HmkOOQeuEFETXfUPHh1ylu2+3yINl34ERJN5YwgR/S+BKENOsJTu5XkYTCvc90CuvfkoF9K5Y2yE5nKwZaSf7n2SbUPil2Zph4l7opsd5IKxi6k2mVzw/CO2NHr136BZ06+sKXytDgorWqWzqnci8zfxeYF3D7q7AXD+IDVMP5T6op93oS2enAQFHG1vTLB0otQqnxUgNANbJkrKgXAS8G8I1m2sPz+qOFuuZa2/nqhzrd6/DEur5VoW6n9c/OcrbfapLEzD1jQDmsQI7oZkT++dt3Ogb3Vk=
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIII1sLqY7Nqi1A3CKXLokfn1vrns/lK1gUkDNSlbek2o
                                             compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM9QZXHUsthFMKA5Si4Htl7MIwK0G4VAltQgbo39JJHrgD7h27U1jbnuJQ1S2bBX8FMSkqf5TPmM7Gr9QOATO+4=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDWbrXZxuAw0n/xJmOvWW/Qbg53ya2CuJKzcHA+OvDpHLHGxkEuiUhwKvqUbfSTzn0o1M00OYITJIvZVINGRtQC7hGvBPWLVBON097mcmnju857I72U3dGdvGhnEUHyrglCV+xSkafQTTlnY9B59EKImUs/kiwRy3cYDWkCgthJgiPA4QSw6WrzaqpY2ET+7n+yY31EOagGA3ufW43qFbHX4diFuXpS1I1PLvvA4KINlMlsFcyR29j4nQk/vb5hMpLmBOlfVH16CXZC98a0ltp9ib7F3e1Wjdogj92kxwfQMYIeQEBp11Tc/PY5U90J51oyk8xYOKfsP3+r9yczmfRDjwR3+tzUMKyZYAsKQVcOGQC7x9sEXg3mBeXRVrlIVZFMuNVcYq4CY40fDIybcI25GxgRbQR7ZUWODG1SL7RF02Z+LQB6APXkzxdQUWLWPryj/EtOgnHQ1I0+BJTWrqGkKbSj41jhRTfS+MZvRXAJ+fNyZFhpkHo54DrCii4cbyM=
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGRPkwTcFVg/dIKRq29iWBfkoVFqIQ1pXOCPxfcGWRFF
                                             compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGf/hJ2dg/PRwojw63FLyKqua+ChKP+2bc7Eb0p70H6ve1elFVeY8lVRXx33JWc2m/XfgSWPNcUs9zBG8QcFVak=
                                              create=True mode=0644 path=/tmp/ansible.fvcogk_j state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:44 compute-1 sudo[112639]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be00095a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:44 compute-1 sudo[112791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fqqqylopnunrwrdtnjipyggptrjdmgqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162324.3801212-197-43610892962821/AnsiballZ_command.py'
Jan 23 09:58:44 compute-1 sudo[112791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:44 compute-1 python3.9[112793]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.fvcogk_j' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:45 compute-1 sudo[112791]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:45.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:45 compute-1 ceph-mon[80126]: pgmap v156: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:45 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:45.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:45 compute-1 sudo[112946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usnzwvhdwglatdytkdlspwibniuapvrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162325.2921538-221-155824038640590/AnsiballZ_file.py'
Jan 23 09:58:45 compute-1 sudo[112946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:45 compute-1 python3.9[112948]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.fvcogk_j state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:45 compute-1 sudo[112946]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:46 compute-1 sshd-session[111571]: Connection closed by 192.168.122.30 port 43858
Jan 23 09:58:46 compute-1 sshd-session[111568]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:58:46 compute-1 systemd[1]: session-44.scope: Deactivated successfully.
Jan 23 09:58:46 compute-1 systemd[1]: session-44.scope: Consumed 5.017s CPU time.
Jan 23 09:58:46 compute-1 systemd-logind[807]: Session 44 logged out. Waiting for processes to exit.
Jan 23 09:58:46 compute-1 systemd-logind[807]: Removed session 44.
Jan 23 09:58:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:47.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:47 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0031e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:47.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:48 compute-1 ceph-mon[80126]: pgmap v157: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:58:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:49.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:58:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:49 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:49.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:49 compute-1 ceph-mon[80126]: pgmap v158: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:58:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:50 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:58:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:50 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:51.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:51 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:51.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:51 compute-1 ceph-mon[80126]: pgmap v159: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:52 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:52 compute-1 ceph-mon[80126]: pgmap v160: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 09:58:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:52 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:53.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:53 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:53.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:53 compute-1 sshd-session[112977]: Accepted publickey for zuul from 192.168.122.30 port 45032 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:58:53 compute-1 systemd-logind[807]: New session 45 of user zuul.
Jan 23 09:58:53 compute-1 systemd[1]: Started Session 45 of User zuul.
Jan 23 09:58:53 compute-1 sshd-session[112977]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:58:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:54 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:54 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:54 compute-1 python3.9[113130]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:58:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:55.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:55 compute-1 ceph-mon[80126]: pgmap v161: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:55 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 09:58:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:55 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:55.245 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:55 compute-1 sudo[113286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wdfjmzemdotywniozzxaogqzjpeuymwo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162335.229457-52-191653903774641/AnsiballZ_systemd.py'
Jan 23 09:58:55 compute-1 sudo[113286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:56 compute-1 python3.9[113288]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 09:58:56 compute-1 sudo[113286]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:56 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb80016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:56 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:56 compute-1 sudo[113440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wmwumtpzvryvnqgbhbkudrwfxdvadgmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162336.4993608-76-150507486981474/AnsiballZ_systemd.py'
Jan 23 09:58:56 compute-1 sudo[113440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:57.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:57 compute-1 python3.9[113442]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 09:58:57 compute-1 sudo[113440]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095857 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:58:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:57 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:58:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:57.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:58:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:58:57 compute-1 sudo[113544]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:58:57 compute-1 sudo[113544]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:58:57 compute-1 sudo[113544]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:57 compute-1 ceph-mon[80126]: pgmap v162: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:58:57 compute-1 sudo[113619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxdbkeweroyajcchtwrooyrqxprzrulk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162337.4691303-103-275176085779435/AnsiballZ_command.py'
Jan 23 09:58:57 compute-1 sudo[113619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:58 compute-1 python3.9[113621]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:58:58 compute-1 sudo[113619]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:58 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:58 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:58 compute-1 sudo[113772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzhdfmfosorfsddkmjhuxrugecbccwje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162338.3330495-127-99426570102335/AnsiballZ_stat.py'
Jan 23 09:58:58 compute-1 sudo[113772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:58 compute-1 python3.9[113774]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:58:58 compute-1 sudo[113772]: pam_unix(sudo:session): session closed for user root
Jan 23 09:58:59 compute-1 ceph-mon[80126]: pgmap v163: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 09:58:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:58:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:58:59.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:58:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:58:59 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:58:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:58:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:58:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:58:59.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:58:59 compute-1 sudo[113925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jiwphyoomategnecbmjfgxjvtsnppxkk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162339.196799-154-62026483279175/AnsiballZ_file.py'
Jan 23 09:58:59 compute-1 sudo[113925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:58:59 compute-1 python3.9[113927]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:58:59 compute-1 sudo[113925]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:00 compute-1 sshd-session[112980]: Connection closed by 192.168.122.30 port 45032
Jan 23 09:59:00 compute-1 sshd-session[112977]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:59:00 compute-1 systemd[1]: session-45.scope: Deactivated successfully.
Jan 23 09:59:00 compute-1 systemd[1]: session-45.scope: Consumed 3.955s CPU time.
Jan 23 09:59:00 compute-1 systemd-logind[807]: Session 45 logged out. Waiting for processes to exit.
Jan 23 09:59:00 compute-1 systemd-logind[807]: Removed session 45.
Jan 23 09:59:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:00 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:00 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:01.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:01 compute-1 ceph-mon[80126]: pgmap v164: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:01 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd4001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:01.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:02 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:02 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:03.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:03 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:03 compute-1 ceph-mon[80126]: pgmap v165: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:03.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:04 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd4001110 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:04 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:05.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:05 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:59:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:05.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:59:05 compute-1 ceph-mon[80126]: pgmap v166: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:06 compute-1 sshd-session[113956]: Accepted publickey for zuul from 192.168.122.30 port 41038 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:59:06 compute-1 systemd-logind[807]: New session 46 of user zuul.
Jan 23 09:59:06 compute-1 systemd[1]: Started Session 46 of User zuul.
Jan 23 09:59:06 compute-1 sshd-session[113956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:59:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:06 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:06 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd40022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:07 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:59:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:07.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:07 compute-1 python3.9[114109]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:59:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:07 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:07.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:07 compute-1 ceph-mon[80126]: pgmap v167: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:07 compute-1 sudo[114264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fhkibdmglwhpjwwdvkmgprlshwymfmak ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162347.6995738-58-258170651856838/AnsiballZ_setup.py'
Jan 23 09:59:07 compute-1 sudo[114264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:08 compute-1 python3.9[114266]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 09:59:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:08 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:08 compute-1 sudo[114264]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:08 compute-1 ceph-mon[80126]: pgmap v168: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:08 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:08 compute-1 sudo[114348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uwnusaoyjqlxmgabusrofycaxydqmpli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162347.6995738-58-258170651856838/AnsiballZ_dnf.py'
Jan 23 09:59:08 compute-1 sudo[114348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:09.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:09 compute-1 python3.9[114350]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Jan 23 09:59:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:09 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd40022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:09.266 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 09:59:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:59:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 09:59:10 compute-1 sudo[114348]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:10 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:11.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:11 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:11.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:11 compute-1 python3.9[114503]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 09:59:11 compute-1 ceph-mon[80126]: pgmap v169: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:12 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bd40022a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:12 compute-1 ceph-mon[80126]: pgmap v170: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:12 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:13 compute-1 python3.9[114654]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 09:59:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:13.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:13 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:13.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:13 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 09:59:13 compute-1 python3.9[114805]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:59:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:14 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:14 compute-1 python3.9[114955]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 09:59:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:14 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:15 compute-1 ceph-mon[80126]: pgmap v171: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:15.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:15 compute-1 sshd-session[113959]: Connection closed by 192.168.122.30 port 41038
Jan 23 09:59:15 compute-1 sshd-session[113956]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:59:15 compute-1 systemd[1]: session-46.scope: Deactivated successfully.
Jan 23 09:59:15 compute-1 systemd[1]: session-46.scope: Consumed 5.738s CPU time.
Jan 23 09:59:15 compute-1 systemd-logind[807]: Session 46 logged out. Waiting for processes to exit.
Jan 23 09:59:15 compute-1 systemd-logind[807]: Removed session 46.
Jan 23 09:59:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:15 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:15.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:16 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:16 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:17.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:17 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:17.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:17 compute-1 ceph-mon[80126]: pgmap v172: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:17 compute-1 sudo[114984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:59:17 compute-1 sudo[114984]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:17 compute-1 sudo[114984]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:18 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8002290 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:18 compute-1 ceph-mon[80126]: pgmap v173: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 09:59:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:18 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000029s ======
Jan 23 09:59:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:19.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000029s
Jan 23 09:59:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095919 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 09:59:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:19 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:19.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:20 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:20 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:21.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:21 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:21.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:21 compute-1 ceph-mon[80126]: pgmap v174: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:22 compute-1 sshd-session[115012]: Accepted publickey for zuul from 192.168.122.30 port 41940 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 09:59:22 compute-1 systemd-logind[807]: New session 47 of user zuul.
Jan 23 09:59:22 compute-1 systemd[1]: Started Session 47 of User zuul.
Jan 23 09:59:22 compute-1 sshd-session[115012]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 09:59:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:22 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:23 compute-1 ceph-mon[80126]: pgmap v175: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 09:59:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:23.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:23 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:23.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:23 compute-1 python3.9[115165]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 09:59:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:24 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:24 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:25 compute-1 sudo[115320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcnuqjubukqlzedmoikcypgecgwnidhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162364.6070142-105-42532026184715/AnsiballZ_file.py'
Jan 23 09:59:25 compute-1 sudo[115320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:25.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:25 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:25.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:25 compute-1 python3.9[115322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:25 compute-1 sudo[115320]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:25 compute-1 ceph-mon[80126]: pgmap v176: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:25 compute-1 sudo[115473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqtvkfnyqusyjbzzubgnncpgidrsfzhc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162365.4762168-105-100279490272221/AnsiballZ_file.py'
Jan 23 09:59:25 compute-1 sudo[115473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:25 compute-1 python3.9[115475]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:25 compute-1 sudo[115473]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:26 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0001d70 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:26 compute-1 sudo[115625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqfelwcbwmtedotisoqzrxawwdknuaue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162366.1367166-150-121561062680685/AnsiballZ_stat.py'
Jan 23 09:59:26 compute-1 sudo[115625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:26 compute-1 ceph-mon[80126]: pgmap v177: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:26 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:26 compute-1 python3.9[115627]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:26 compute-1 sudo[115625]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:27.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:27 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:27.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:27 compute-1 sudo[115748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgtvmifkqnjlafksfwxebexzzpdrtvam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162366.1367166-150-121561062680685/AnsiballZ_copy.py'
Jan 23 09:59:27 compute-1 sudo[115748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:27 compute-1 python3.9[115751]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162366.1367166-150-121561062680685/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=200902694b7ce68180eae274ebcbc81826cfce70 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:27 compute-1 sudo[115748]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:27 compute-1 sudo[115901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eszrhwrhpckxfevztkqfwohzdupthlre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162367.7156394-150-201791179724440/AnsiballZ_stat.py'
Jan 23 09:59:27 compute-1 sudo[115901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:28 compute-1 python3.9[115903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:28 compute-1 sudo[115901]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:28 compute-1 sudo[116024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rdnatbvsfvbhxftovvktuzvjpwfmtwrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162367.7156394-150-201791179724440/AnsiballZ_copy.py'
Jan 23 09:59:28 compute-1 sudo[116024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:28 compute-1 python3.9[116026]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162367.7156394-150-201791179724440/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=ff17d6d1438a69ae92e7570d79b66fb807ae4885 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:28 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:28 compute-1 sudo[116024]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:29 compute-1 ceph-mon[80126]: pgmap v178: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000030s ======
Jan 23 09:59:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:29.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000030s
Jan 23 09:59:29 compute-1 sudo[116176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pbnkeoldmxvfyycgwcvmllamgrwoobjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162368.931849-150-129764174144882/AnsiballZ_stat.py'
Jan 23 09:59:29 compute-1 sudo[116176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:29 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:29.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:29 compute-1 python3.9[116178]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:29 compute-1 sudo[116176]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:29 compute-1 sudo[116300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ezengsbyagfywsxcmvlwzzyquejxcjib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162368.931849-150-129764174144882/AnsiballZ_copy.py'
Jan 23 09:59:29 compute-1 sudo[116300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:29 compute-1 python3.9[116302]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162368.931849-150-129764174144882/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=663b16ef7007139396e1a67c87fa3c37816c2c66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:29 compute-1 sudo[116300]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:30 compute-1 sudo[116452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uysxsijoyugevuwhyvtzeqqjhylxqepi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162370.1618488-280-218395003395449/AnsiballZ_file.py'
Jan 23 09:59:30 compute-1 sudo[116452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.475126) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370475268, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1154, "num_deletes": 251, "total_data_size": 2885437, "memory_usage": 2912432, "flush_reason": "Manual Compaction"}
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Jan 23 09:59:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:30 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370805650, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 1881107, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 11891, "largest_seqno": 13040, "table_properties": {"data_size": 1876037, "index_size": 2594, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1413, "raw_key_size": 10506, "raw_average_key_size": 19, "raw_value_size": 1865914, "raw_average_value_size": 3386, "num_data_blocks": 116, "num_entries": 551, "num_filter_entries": 551, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162273, "oldest_key_time": 1769162273, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 330716 microseconds, and 7290 cpu microseconds.
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.805872) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 1881107 bytes OK
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.805944) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.812193) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.812317) EVENT_LOG_v1 {"time_micros": 1769162370812297, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.812361) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 2879868, prev total WAL file size 2879868, number of live WAL files 2.
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.814038) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F7300353032' seq:72057594037927935, type:22 .. '7061786F7300373534' seq:0, type:0; will stop at (end)
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(1837KB)], [24(12MB)]
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370814207, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 15336659, "oldest_snapshot_seqno": -1}
Jan 23 09:59:30 compute-1 python3.9[116454]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:30 compute-1 sudo[116452]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 4280 keys, 13223994 bytes, temperature: kUnknown
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370928940, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 13223994, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13192507, "index_size": 19665, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 10757, "raw_key_size": 109447, "raw_average_key_size": 25, "raw_value_size": 13111325, "raw_average_value_size": 3063, "num_data_blocks": 828, "num_entries": 4280, "num_filter_entries": 4280, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162370, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.929192) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 13223994 bytes
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.930801) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 133.6 rd, 115.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 12.8 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(15.2) write-amplify(7.0) OK, records in: 4796, records dropped: 516 output_compression: NoCompression
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.930846) EVENT_LOG_v1 {"time_micros": 1769162370930829, "job": 12, "event": "compaction_finished", "compaction_time_micros": 114797, "compaction_time_cpu_micros": 52375, "output_level": 6, "num_output_files": 1, "total_output_size": 13223994, "num_input_records": 4796, "num_output_records": 4280, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370931321, "job": 12, "event": "table_file_deletion", "file_number": 26}
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162370933470, "job": 12, "event": "table_file_deletion", "file_number": 24}
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.813730) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933572) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933579) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933581) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933582) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:30 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-09:59:30.933585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 09:59:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:31.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:31 compute-1 sudo[116604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qfsxgxslecgwnsqjewceikguhkhewmqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162371.0061786-280-237961642238629/AnsiballZ_file.py'
Jan 23 09:59:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:31 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:31 compute-1 sudo[116604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:31.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:31 compute-1 ceph-mon[80126]: pgmap v179: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:59:31 compute-1 python3.9[116606]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:31 compute-1 sudo[116604]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:32 compute-1 sudo[116757]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eqkkoxihdhdwxovyutaosfwjqisjppls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162371.7146757-329-194713526187765/AnsiballZ_stat.py'
Jan 23 09:59:32 compute-1 sudo[116757]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:32 compute-1 python3.9[116759]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:32 compute-1 sudo[116757]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:32 compute-1 sudo[116880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khcjqdchdkgabmgbujyhaurxvowkpeuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162371.7146757-329-194713526187765/AnsiballZ_copy.py'
Jan 23 09:59:32 compute-1 sudo[116880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:32 compute-1 python3.9[116882]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162371.7146757-329-194713526187765/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=f740c4b30b5527eca1229a1da8351348fcc44551 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:32 compute-1 sudo[116880]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:32 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:33.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:33 compute-1 sudo[117032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsscgbsordinyauooebelrqthsnnlyei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162372.8842256-329-237368036976808/AnsiballZ_stat.py'
Jan 23 09:59:33 compute-1 sudo[117032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:33 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:33.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:33 compute-1 python3.9[117034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:33 compute-1 sudo[117032]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:33 compute-1 sudo[117156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqahykuiwxziyxtpbfphdhvolnrtumjf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162372.8842256-329-237368036976808/AnsiballZ_copy.py'
Jan 23 09:59:33 compute-1 sudo[117156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:33 compute-1 ceph-mon[80126]: pgmap v180: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 09:59:33 compute-1 python3.9[117158]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162372.8842256-329-237368036976808/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:33 compute-1 sudo[117156]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:34 compute-1 sudo[117308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dwadtakthkzsnwevrhreyuoyzrpfwdup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162374.035501-329-77376745714389/AnsiballZ_stat.py'
Jan 23 09:59:34 compute-1 sudo[117308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:34 compute-1 python3.9[117310]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:34 compute-1 sudo[117308]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:34 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:34 compute-1 sudo[117431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjwxiwsfnfvrevrrcezvmsfrshbwhjoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162374.035501-329-77376745714389/AnsiballZ_copy.py'
Jan 23 09:59:34 compute-1 sudo[117431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:35 compute-1 python3.9[117433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162374.035501-329-77376745714389/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=b196055dd1ec29a2d4bd394f7949ec509db05ad5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:35 compute-1 sudo[117431]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:59:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:35.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:59:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:35 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:35.308 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:35 compute-1 sudo[117584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzghqujpumelfjkhxflptagvyqkaimcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162375.2786736-448-224071120735324/AnsiballZ_file.py'
Jan 23 09:59:35 compute-1 sudo[117584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:35 compute-1 python3.9[117586]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:35 compute-1 sudo[117584]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:35 compute-1 ceph-mon[80126]: pgmap v181: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:36 compute-1 sudo[117736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xacxbsyitewdzbubybbpnklvwbojrvok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162375.894948-448-38050028262505/AnsiballZ_file.py'
Jan 23 09:59:36 compute-1 sudo[117736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:36 compute-1 sudo[117739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:59:36 compute-1 sudo[117739]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:36 compute-1 sudo[117739]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:36 compute-1 python3.9[117738]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:36 compute-1 sudo[117764]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 09:59:36 compute-1 sudo[117764]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:36 compute-1 sudo[117736]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:36 compute-1 sudo[117764]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:36 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:36 compute-1 sudo[117959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jomujhflnulfvwhbctbkmgxrjogjtezt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162376.5885682-495-92008184019561/AnsiballZ_stat.py'
Jan 23 09:59:36 compute-1 sudo[117959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:36 compute-1 ceph-mon[80126]: pgmap v182: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:37 compute-1 python3.9[117961]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:37 compute-1 sudo[117959]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:37.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:37 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:59:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:37.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:59:37 compute-1 sudo[118083]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htbugdznswjccrqwefruylvxsthtlnsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162376.5885682-495-92008184019561/AnsiballZ_copy.py'
Jan 23 09:59:37 compute-1 sudo[118083]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:37 compute-1 python3.9[118085]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162376.5885682-495-92008184019561/.source.crt _original_basename=compute-1.ctlplane.example.com-tls.crt follow=False checksum=a4c0461c7922277b00b636dd64d46b12688eb9b6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:37 compute-1 sudo[118083]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-1 sudo[118110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 09:59:37 compute-1 sudo[118110]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:37 compute-1 sudo[118110]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:37 compute-1 sudo[118158]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 09:59:37 compute-1 sudo[118158]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:37 compute-1 sudo[118242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:59:37 compute-1 sudo[118242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:37 compute-1 sudo[118242]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:37 compute-1 sudo[118310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izxdzmwczmfdehatuydqnrylzbcirtfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162377.7212057-495-19416349126539/AnsiballZ_stat.py'
Jan 23 09:59:37 compute-1 sudo[118310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:38 compute-1 python3.9[118314]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:38 compute-1 sudo[118310]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:38 compute-1 sudo[118158]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:38 compute-1 sudo[118464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nbuwygdxzmpzpysuwbxgokakrolepyhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162377.7212057-495-19416349126539/AnsiballZ_copy.py'
Jan 23 09:59:38 compute-1 sudo[118464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:38 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:38 compute-1 python3.9[118466]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162377.7212057-495-19416349126539/.source.crt _original_basename=compute-1.ctlplane.example.com-ca.crt follow=False checksum=7ea5769d722c11e7459792c631f886a53fdd1360 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:38 compute-1 sudo[118464]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:39.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095939 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:59:39 compute-1 sudo[118616]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtygtkcqlybixthhsuhyfkwzbinbcver ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162378.960454-495-72322725231718/AnsiballZ_stat.py'
Jan 23 09:59:39 compute-1 sudo[118616]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:39 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:39.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:39 compute-1 python3.9[118618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:39 compute-1 sudo[118616]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:39 compute-1 ceph-mon[80126]: pgmap v183: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 09:59:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:59:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 09:59:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 09:59:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 09:59:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 09:59:39 compute-1 sudo[118740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-votalsqihrrydmbdlpfbvcrlpqvkliau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162378.960454-495-72322725231718/AnsiballZ_copy.py'
Jan 23 09:59:39 compute-1 sudo[118740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:39 compute-1 python3.9[118742]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162378.960454-495-72322725231718/.source.key _original_basename=compute-1.ctlplane.example.com-tls.key follow=False checksum=9a1d9e1331601b6c16031f0051d14a3eadf04541 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:39 compute-1 sudo[118740]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:40 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:41 compute-1 ceph-mon[80126]: pgmap v184: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:41 compute-1 sudo[118892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zpczxbbwstcjwjtfxqbpzijkkzejxciy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162380.795975-655-215567569208631/AnsiballZ_file.py'
Jan 23 09:59:41 compute-1 sudo[118892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:41.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:41 compute-1 python3.9[118894]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:41 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:41 compute-1 sudo[118892]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:41.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:41 compute-1 sudo[119045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmhtjjvbfozsjglhbnooysxskchqyocs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162381.4740531-681-169956418797010/AnsiballZ_stat.py'
Jan 23 09:59:41 compute-1 sudo[119045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:42 compute-1 python3.9[119047]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:42 compute-1 sudo[119045]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:42 compute-1 sudo[119168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jcyoqeqzhzjvmacrxhpbhtnnoeqxuawc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162381.4740531-681-169956418797010/AnsiballZ_copy.py'
Jan 23 09:59:42 compute-1 sudo[119168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:42 compute-1 python3.9[119170]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162381.4740531-681-169956418797010/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:42 compute-1 sudo[119168]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:42 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:43 compute-1 sudo[119320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfafqtjhjvjamsyyxsjnknwqymcrsabn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162382.763821-737-267797127194785/AnsiballZ_file.py'
Jan 23 09:59:43 compute-1 sudo[119320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:43.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:43 compute-1 python3.9[119322]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:43 compute-1 sudo[119320]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:43 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:43.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:44 compute-1 sudo[119473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wxdeyrfbtjhhjipuwkhqrimtvglzpose ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162383.4424393-762-250779937840994/AnsiballZ_stat.py'
Jan 23 09:59:44 compute-1 sudo[119473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:44 compute-1 python3.9[119475]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:44 compute-1 sudo[119473]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:44 compute-1 ceph-mon[80126]: pgmap v185: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 09:59:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:44 compute-1 sudo[119596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efeopytzukiauzjzeyadvpilfnqxfbxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162383.4424393-762-250779937840994/AnsiballZ_copy.py'
Jan 23 09:59:44 compute-1 sudo[119596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:44 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:44 compute-1 python3.9[119598]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162383.4424393-762-250779937840994/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:44 compute-1 sudo[119596]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:45.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:45 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb8003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:45.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:45 compute-1 sudo[119749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-relwetbbrlurltftphecqnouhkopoxwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162385.190026-816-161620018455321/AnsiballZ_file.py'
Jan 23 09:59:45 compute-1 sudo[119749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:45 compute-1 python3.9[119751]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:45 compute-1 sudo[119749]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:46 compute-1 sudo[119901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjyjlbefdfgorntybizqjhkuwuefqjfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162385.9573383-840-251399452380848/AnsiballZ_stat.py'
Jan 23 09:59:46 compute-1 sudo[119901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:46 compute-1 ceph-mon[80126]: pgmap v186: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bcc0042e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:46 compute-1 python3.9[119903]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:46 compute-1 sudo[119901]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:46 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bc8003f60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:46 compute-1 sudo[120026]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jbtmjlfmiganbohxkdklrxqwcphirvfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162385.9573383-840-251399452380848/AnsiballZ_copy.py'
Jan 23 09:59:46 compute-1 sudo[120026]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:47 compute-1 python3.9[120028]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162385.9573383-840-251399452380848/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:47 compute-1 sudo[120026]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:47 compute-1 sshd-session[119965]: Invalid user sol from 45.148.10.240 port 57622
Jan 23 09:59:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:47.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:47 compute-1 sshd-session[119965]: Connection closed by invalid user sol 45.148.10.240 port 57622 [preauth]
Jan 23 09:59:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:47 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:47.328 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:47 compute-1 sudo[120179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwuuwrcsxjimlhssvlzykncowomwertu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162387.245293-885-36492521832754/AnsiballZ_file.py'
Jan 23 09:59:47 compute-1 sudo[120179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:47 compute-1 python3.9[120181]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:47 compute-1 sudo[120179]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:48 compute-1 sudo[120331]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzelvsypumdjkwkdouzdiryxnkvctyoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162387.9418838-910-166972745957666/AnsiballZ_stat.py'
Jan 23 09:59:48 compute-1 sudo[120331]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:48 compute-1 python3.9[120333]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:48 compute-1 sudo[120331]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:48 compute-1 ceph-mon[80126]: pgmap v187: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 09:59:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 09:59:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:48 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5be0009f20 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 09:59:48 compute-1 sudo[120456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cwrchlspdozncygpzuyxpcuxevwjqdtr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162387.9418838-910-166972745957666/AnsiballZ_copy.py'
Jan 23 09:59:48 compute-1 sudo[120456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:49 compute-1 python3.9[120458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162387.9418838-910-166972745957666/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:49 compute-1 sudo[120456]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:49.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[108466]: 23/01/2026 09:59:49 : epoch 69734637 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5bb4002690 fd 48 proxy ignored for local
Jan 23 09:59:49 compute-1 kernel: ganesha.nfsd[120334]: segfault at 50 ip 00007f5c62ec932e sp 00007f5bc6ffc210 error 4 in libntirpc.so.5.8[7f5c62eae000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 23 09:59:49 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 09:59:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:49.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:49 compute-1 systemd[1]: Started Process Core Dump (PID 120519/UID 0).
Jan 23 09:59:49 compute-1 ceph-mon[80126]: pgmap v188: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:49 compute-1 sudo[120611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kefdgioqegmgshjtwbmhcbufvikdwuda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162389.2883737-958-113537180484281/AnsiballZ_file.py'
Jan 23 09:59:49 compute-1 sudo[120611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:49 compute-1 sudo[120612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 09:59:49 compute-1 sudo[120612]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:49 compute-1 sudo[120612]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:49 compute-1 python3.9[120624]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:49 compute-1 sudo[120611]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:50 compute-1 sudo[120788]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aufnbygieythnhmdyjsbagqnfqxggzeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162389.9972572-982-109459922440105/AnsiballZ_stat.py'
Jan 23 09:59:50 compute-1 sudo[120788]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:50 compute-1 python3.9[120790]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:50 compute-1 sudo[120788]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 09:59:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 09:59:50 compute-1 sudo[120911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yfnpsfkgnngdrksxmeioxhglfkorgvic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162389.9972572-982-109459922440105/AnsiballZ_copy.py'
Jan 23 09:59:50 compute-1 sudo[120911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:51 compute-1 systemd-coredump[120532]: Process 108470 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007f5c62ec932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 09:59:51 compute-1 python3.9[120913]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162389.9972572-982-109459922440105/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:51 compute-1 systemd[1]: systemd-coredump@3-120519-0.service: Deactivated successfully.
Jan 23 09:59:51 compute-1 systemd[1]: systemd-coredump@3-120519-0.service: Consumed 1.658s CPU time.
Jan 23 09:59:51 compute-1 sudo[120911]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:51.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:51 compute-1 podman[120918]: 2026-01-23 09:59:51.193207725 +0000 UTC m=+0.032825898 container died a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default)
Jan 23 09:59:51 compute-1 systemd[1]: var-lib-containers-storage-overlay-521ae91620297aecd98df56f95e4659091d7e86cf9085ee08042bb3dcd487913-merged.mount: Deactivated successfully.
Jan 23 09:59:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:51.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:51 compute-1 sudo[121084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lvnspxvbejgljbhlplinprrjexqdlwkg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162391.3237197-1030-103510490746747/AnsiballZ_file.py'
Jan 23 09:59:51 compute-1 sudo[121084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:51 compute-1 python3.9[121086]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 09:59:51 compute-1 sudo[121084]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 09:59:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Cumulative writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
                                           Cumulative WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 21.31 MB, 0.04 MB/s
                                           Interval WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 09:59:52 compute-1 podman[120918]: 2026-01-23 09:59:52.010458359 +0000 UTC m=+0.850076512 container remove a3906df4a161cfad19fcf9122a3a508eb6ec4a3892bfb83566fb080dfe48e4d3 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 09:59:52 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 09:59:52 compute-1 ceph-mon[80126]: pgmap v189: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 09:59:52 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 09:59:52 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.907s CPU time.
Jan 23 09:59:52 compute-1 sudo[121264]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yjzuqhwlnnfwuygccrwyegvpenjasgji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162391.9776301-1052-42453237798948/AnsiballZ_stat.py'
Jan 23 09:59:52 compute-1 sudo[121264]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:52 compute-1 python3.9[121266]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 09:59:52 compute-1 sudo[121264]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:52 compute-1 ceph-mon[80126]: pgmap v190: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 595 B/s wr, 1 op/s
Jan 23 09:59:52 compute-1 sudo[121387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cldrtdkcgpukwqkjjqscmdhhjoundqoz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162391.9776301-1052-42453237798948/AnsiballZ_copy.py'
Jan 23 09:59:52 compute-1 sudo[121387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 09:59:53 compute-1 python3.9[121389]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162391.9776301-1052-42453237798948/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=022ad0c65ad9b9ad4d20c21b3609f531109c55bd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 09:59:53 compute-1 sudo[121387]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:53.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:53.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:55.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 09:59:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:55.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:55 compute-1 ceph-mon[80126]: pgmap v191: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 595 B/s wr, 1 op/s
Jan 23 09:59:55 compute-1 sshd-session[115015]: Connection closed by 192.168.122.30 port 41940
Jan 23 09:59:55 compute-1 sshd-session[115012]: pam_unix(sshd:session): session closed for user zuul
Jan 23 09:59:55 compute-1 systemd[1]: session-47.scope: Deactivated successfully.
Jan 23 09:59:55 compute-1 systemd[1]: session-47.scope: Consumed 23.336s CPU time.
Jan 23 09:59:55 compute-1 systemd-logind[807]: Session 47 logged out. Waiting for processes to exit.
Jan 23 09:59:55 compute-1 systemd-logind[807]: Removed session 47.
Jan 23 09:59:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/095956 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 09:59:56 compute-1 ceph-mon[80126]: pgmap v192: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 595 B/s wr, 1 op/s
Jan 23 09:59:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:57.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 09:59:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:57.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 09:59:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 09:59:58 compute-1 sudo[121417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 09:59:58 compute-1 sudo[121417]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 09:59:58 compute-1 sudo[121417]: pam_unix(sudo:session): session closed for user root
Jan 23 09:59:59 compute-1 ceph-mon[80126]: pgmap v193: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 936 B/s wr, 3 op/s
Jan 23 09:59:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 09:59:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:09:59:59.167 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 09:59:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 09:59:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 09:59:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:09:59:59.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:00 compute-1 ceph-mon[80126]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:00:00 compute-1 ceph-mon[80126]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:00:00 compute-1 ceph-mon[80126]:      osd.1 observed slow operation indications in BlueStore
Jan 23 10:00:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:01.171 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100001 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:00:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:01.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:01 compute-1 ceph-mon[80126]: pgmap v194: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.4 KiB/s rd, 851 B/s wr, 2 op/s
Jan 23 10:00:02 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 4.
Jan 23 10:00:02 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:00:02 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.907s CPU time.
Jan 23 10:00:02 compute-1 sshd-session[121444]: Accepted publickey for zuul from 192.168.122.30 port 57652 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:00:02 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:00:02 compute-1 systemd-logind[807]: New session 48 of user zuul.
Jan 23 10:00:02 compute-1 systemd[1]: Started Session 48 of User zuul.
Jan 23 10:00:02 compute-1 sshd-session[121444]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:00:02 compute-1 podman[121547]: 2026-01-23 10:00:02.543877641 +0000 UTC m=+0.043287046 container create 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True)
Jan 23 10:00:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:00:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:00:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:00:02 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:00:02 compute-1 podman[121547]: 2026-01-23 10:00:02.608323159 +0000 UTC m=+0.107732564 container init 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, ceph=True)
Jan 23 10:00:02 compute-1 podman[121547]: 2026-01-23 10:00:02.523147662 +0000 UTC m=+0.022557077 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:00:02 compute-1 podman[121547]: 2026-01-23 10:00:02.620740487 +0000 UTC m=+0.120149872 container start 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:00:02 compute-1 bash[121547]: 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141
Jan 23 10:00:02 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:00:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:00:02 compute-1 ceph-mon[80126]: pgmap v195: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 851 B/s wr, 2 op/s
Jan 23 10:00:02 compute-1 sudo[121701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cznamubcarfimqukqkjkwurbmshjwgqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162402.4131775-22-194156931873612/AnsiballZ_file.py'
Jan 23 10:00:02 compute-1 sudo[121701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:03 compute-1 python3.9[121703]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:03 compute-1 sudo[121701]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:03.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:03.361 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:03 compute-1 sudo[121854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lsvqfozobhgbyojnhiogpkkwfntlttne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162403.3364599-58-95983260235889/AnsiballZ_stat.py'
Jan 23 10:00:03 compute-1 sudo[121854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:04 compute-1 python3.9[121856]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:04 compute-1 sudo[121854]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:04 compute-1 sudo[121977]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-einhijzkcaigowcsomsrdbyfjnkxuvhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162403.3364599-58-95983260235889/AnsiballZ_copy.py'
Jan 23 10:00:04 compute-1 sudo[121977]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:04 compute-1 python3.9[121979]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162403.3364599-58-95983260235889/.source.conf _original_basename=ceph.conf follow=False checksum=c8d90d44a83782ff84a3d797d09c3b204e2d1c61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:04 compute-1 sudo[121977]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:05 compute-1 sudo[122129]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vpcrbszkqjhzrtquortelfvczzwnntod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162404.8678617-58-255395524335640/AnsiballZ_stat.py'
Jan 23 10:00:05 compute-1 sudo[122129]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:05.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:05 compute-1 ceph-mon[80126]: pgmap v196: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:00:05 compute-1 python3.9[122131]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:05.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:05 compute-1 sudo[122129]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:05 compute-1 sudo[122253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcgrpwqgahouooipaphgqayoxyvtqnqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162404.8678617-58-255395524335640/AnsiballZ_copy.py'
Jan 23 10:00:05 compute-1 sudo[122253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:05 compute-1 python3.9[122255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162404.8678617-58-255395524335640/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=a6273c4bda164a032598e5e81cbd7f6e9c0876d5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:05 compute-1 sudo[122253]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:06 compute-1 sshd-session[121450]: Connection closed by 192.168.122.30 port 57652
Jan 23 10:00:06 compute-1 sshd-session[121444]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:00:06 compute-1 systemd[1]: session-48.scope: Deactivated successfully.
Jan 23 10:00:06 compute-1 systemd[1]: session-48.scope: Consumed 2.587s CPU time.
Jan 23 10:00:06 compute-1 systemd-logind[807]: Session 48 logged out. Waiting for processes to exit.
Jan 23 10:00:06 compute-1 systemd-logind[807]: Removed session 48.
Jan 23 10:00:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:07.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:07.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:07 compute-1 ceph-mon[80126]: pgmap v197: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 852 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:00:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:00:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:00:08 compute-1 ceph-mon[80126]: pgmap v198: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 852 B/s wr, 2 op/s
Jan 23 10:00:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:09.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:09.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:11.188 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:11 compute-1 ceph-mon[80126]: pgmap v199: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Jan 23 10:00:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:11.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:12 compute-1 sshd-session[122283]: Accepted publickey for zuul from 192.168.122.30 port 46160 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:00:12 compute-1 systemd-logind[807]: New session 49 of user zuul.
Jan 23 10:00:12 compute-1 systemd[1]: Started Session 49 of User zuul.
Jan 23 10:00:12 compute-1 sshd-session[122283]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:00:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:13.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:13.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:13 compute-1 ceph-mon[80126]: pgmap v200: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:00:13 compute-1 python3.9[122437]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:00:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:00:14 compute-1 sudo[122605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jydumgiczbvabzwwumwwdseqwqyomrjj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162414.4353983-58-54901428081561/AnsiballZ_file.py'
Jan 23 10:00:14 compute-1 sudo[122605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:15 compute-1 python3.9[122607]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:15 compute-1 sudo[122605]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:15.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:15 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd448000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:15.383 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:15 compute-1 sudo[122759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aqdhnoiqszfkkrfcobygzidzukzwfkrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162415.2729914-58-192208141794376/AnsiballZ_file.py'
Jan 23 10:00:15 compute-1 sudo[122759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:15 compute-1 ceph-mon[80126]: pgmap v201: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:00:15 compute-1 python3.9[122761]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:15 compute-1 sudo[122759]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001030 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:16 compute-1 python3.9[122911]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:00:16 compute-1 ceph-mon[80126]: pgmap v202: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:00:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:17.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:17 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd448000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:17.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:17 compute-1 sudo[123062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xtsletbsjurjatjbfbyyenztyybhmevl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162417.0418906-127-221964146019122/AnsiballZ_seboolean.py'
Jan 23 10:00:17 compute-1 sudo[123062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:17 compute-1 python3.9[123064]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 10:00:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:18 compute-1 sudo[123065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:00:18 compute-1 sudo[123065]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:18 compute-1 sudo[123065]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100018 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:00:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:18 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:18 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:19 compute-1 sudo[123062]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:19.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:19 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:00:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:19.390 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:00:19 compute-1 ceph-mon[80126]: pgmap v203: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.4 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:00:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:20 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:20 compute-1 sudo[123245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txiiwplqxzyfgughhglswunswxvthczq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162420.2601779-157-141104705751769/AnsiballZ_setup.py'
Jan 23 10:00:20 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Jan 23 10:00:20 compute-1 sudo[123245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:20 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:20 compute-1 python3.9[123247]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:00:21 compute-1 sudo[123245]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:21 compute-1 ceph-mon[80126]: pgmap v204: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 511 B/s wr, 2 op/s
Jan 23 10:00:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:21.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:21 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:21.394 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:21 compute-1 sudo[123330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nkqsbvknsuvcsoosuftlxqiswftujlxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162420.2601779-157-141104705751769/AnsiballZ_dnf.py'
Jan 23 10:00:21 compute-1 sudo[123330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:21 compute-1 python3.9[123332]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:00:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:22 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:22 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:22 compute-1 ceph-mon[80126]: pgmap v205: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 511 B/s wr, 2 op/s
Jan 23 10:00:23 compute-1 sudo[123330]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:23.211 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:23 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:00:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:23.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:00:24 compute-1 sudo[123484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jluykfbouqkjbmdapvjybyuetpmqzcat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162423.6778286-193-174590305898751/AnsiballZ_systemd.py'
Jan 23 10:00:24 compute-1 sudo[123484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:24 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:24 compute-1 python3.9[123486]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:00:24 compute-1 sudo[123484]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:24 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:25.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:25 compute-1 ceph-mon[80126]: pgmap v206: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:00:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:25 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480021f0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:25.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:25 compute-1 sudo[123640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hltyiqfbhypablwxtkgyljorsedhmwsa ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162424.9721863-217-189147299352967/AnsiballZ_edpm_nftables_snippet.py'
Jan 23 10:00:25 compute-1 sudo[123640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:25 compute-1 python3[123642]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                             rule:
                                               proto: udp
                                               dport: 4789
                                           - rule_name: 119 neutron geneve networks
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               state: ["UNTRACKED"]
                                           - rule_name: 120 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: OUTPUT
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                           - rule_name: 121 neutron geneve networks no conntrack
                                             rule:
                                               proto: udp
                                               dport: 6081
                                               table: raw
                                               chain: PREROUTING
                                               jump: NOTRACK
                                               action: append
                                               state: []
                                            dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Jan 23 10:00:25 compute-1 sudo[123640]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:26 compute-1 sudo[123792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lbxpbskgtlmnctuttlntwntfklzsdqyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162425.9957285-244-81865873563127/AnsiballZ_file.py'
Jan 23 10:00:26 compute-1 sudo[123792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:26 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:26 compute-1 python3.9[123794]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:26 compute-1 sudo[123792]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:26 compute-1 ceph-mon[80126]: pgmap v207: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:00:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:26 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:27.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:27 compute-1 sudo[123944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oehrnyidnxjeumbbloayegwekigehnnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162426.8560328-268-34688344200508/AnsiballZ_stat.py'
Jan 23 10:00:27 compute-1 sudo[123944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:27 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:27.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:27 compute-1 python3.9[123946]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:27 compute-1 sudo[123944]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:27 compute-1 sudo[124023]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wjscayhewehiyvsbalmntmlbvvyvsrkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162426.8560328-268-34688344200508/AnsiballZ_file.py'
Jan 23 10:00:27 compute-1 sudo[124023]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:27 compute-1 python3.9[124025]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:28 compute-1 sudo[124023]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:28 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:28 compute-1 sudo[124175]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-axxohgibkgweejmhqonpuoaxvroqlxau ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162428.2316313-304-1012372570003/AnsiballZ_stat.py'
Jan 23 10:00:28 compute-1 sudo[124175]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:28 compute-1 python3.9[124177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:28 compute-1 sudo[124175]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:28 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:28 compute-1 sudo[124253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jkefuszznjcojzhzlrqzbfgjxkvkkmus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162428.2316313-304-1012372570003/AnsiballZ_file.py'
Jan 23 10:00:28 compute-1 sudo[124253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:29 compute-1 python3.9[124255]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.lbe55_d8 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:29.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:29 compute-1 sudo[124253]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:29 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:29.409 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:29 compute-1 ceph-mon[80126]: pgmap v208: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:00:29 compute-1 sudo[124406]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-chwwkmwrnfffgwwjwghxhyrphmfwhnku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162429.4373724-340-148769435471851/AnsiballZ_stat.py'
Jan 23 10:00:29 compute-1 sudo[124406]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:30 compute-1 python3.9[124408]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:30 compute-1 sudo[124406]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:30 compute-1 sudo[124484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-frmhugkcsoresrvmcyjfxjwgntzohhyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162429.4373724-340-148769435471851/AnsiballZ_file.py'
Jan 23 10:00:30 compute-1 sudo[124484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:30 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:30 compute-1 python3.9[124486]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:30 compute-1 sudo[124484]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:30 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4480095a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:31.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:31 compute-1 sudo[124637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrnsntdmqknryxlreupbwbjatprrhvqt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162430.8780055-379-278550427280396/AnsiballZ_command.py'
Jan 23 10:00:31 compute-1 sudo[124637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:31 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:31.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:31 compute-1 python3.9[124639]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:31 compute-1 sudo[124637]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:31 compute-1 ceph-mon[80126]: pgmap v209: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:32 compute-1 sudo[124790]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jjmabxtmjlcqyrmdpcoznjpdazpnqiec ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162431.8215241-403-174541828696870/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 10:00:32 compute-1 sudo[124790]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:32 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:32 compute-1 python3[124792]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 10:00:32 compute-1 sudo[124790]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:32 compute-1 ceph-mon[80126]: pgmap v210: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:32 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:33 compute-1 sudo[124942]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xeoxuwtfowgeujtawdncqxmihhhjtyyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162432.7276552-427-116407109939417/AnsiballZ_stat.py'
Jan 23 10:00:33 compute-1 sudo[124942]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:33.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:33 compute-1 python3.9[124944]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:33 compute-1 sudo[124942]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:33 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:33.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:33 compute-1 sudo[125068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kriumdxwzqoajwmflrzmyljynxtrhckm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162432.7276552-427-116407109939417/AnsiballZ_copy.py'
Jan 23 10:00:33 compute-1 sudo[125068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:33 compute-1 python3.9[125070]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162432.7276552-427-116407109939417/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:34 compute-1 sudo[125068]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:34 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:34 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:34 compute-1 sudo[125220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpmpohvxmlnwwtqvxalbhmuhvywlphyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162434.2599027-472-142941938339080/AnsiballZ_stat.py'
Jan 23 10:00:34 compute-1 sudo[125220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:35 compute-1 python3.9[125222]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:35 compute-1 sudo[125220]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 10:00:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:35.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 10:00:35 compute-1 ceph-mon[80126]: pgmap v211: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:35 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:35.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:35 compute-1 sudo[125346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rrfxttgcuwawgfyuulutdobwftixjdcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162434.2599027-472-142941938339080/AnsiballZ_copy.py'
Jan 23 10:00:35 compute-1 sudo[125346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:35 compute-1 python3.9[125348]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162434.2599027-472-142941938339080/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:35 compute-1 sudo[125346]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:36 compute-1 sudo[125498]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ktymlcneinxooaaobpdomkswkhazqycx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162435.9345121-517-246391593233414/AnsiballZ_stat.py'
Jan 23 10:00:36 compute-1 sudo[125498]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:36 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:36 compute-1 python3.9[125500]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:36 compute-1 sudo[125498]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:36 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:36 compute-1 sudo[125623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nevvezvhfjhsmnwwubwrqzzytwkybbes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162435.9345121-517-246391593233414/AnsiballZ_copy.py'
Jan 23 10:00:36 compute-1 sudo[125623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:37 compute-1 python3.9[125625]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162435.9345121-517-246391593233414/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:37 compute-1 sudo[125623]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:37.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:37 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:37.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:37 compute-1 ceph-mon[80126]: pgmap v212: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:37 compute-1 sudo[125776]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcvrhrsgefjpahzlzdxdcvzhvgwcqmhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162437.4209535-562-204396651707816/AnsiballZ_stat.py'
Jan 23 10:00:37 compute-1 sudo[125776]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:37 compute-1 python3.9[125778]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:37 compute-1 sudo[125776]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:38 compute-1 sudo[125852]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:00:38 compute-1 sudo[125852]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:38 compute-1 sudo[125852]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:38 compute-1 sudo[125926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suuskbwtfokhhpadiqslmlhogflvqari ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162437.4209535-562-204396651707816/AnsiballZ_copy.py'
Jan 23 10:00:38 compute-1 sudo[125926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:38 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:38 compute-1 python3.9[125928]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162437.4209535-562-204396651707816/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:38 compute-1 sudo[125926]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:38 compute-1 ceph-mon[80126]: pgmap v213: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:00:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:38 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:39 compute-1 sudo[126078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzzqnuurgkfktlvnqmpyluapcgvelljo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162438.7094913-607-232059713954712/AnsiballZ_stat.py'
Jan 23 10:00:39 compute-1 sudo[126078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:00:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:39.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:00:39 compute-1 python3.9[126080]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:39 compute-1 sudo[126078]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:39 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:39.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:39 compute-1 sudo[126204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hnndqntvxptgpdemreuasykhcuxtkkeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162438.7094913-607-232059713954712/AnsiballZ_copy.py'
Jan 23 10:00:39 compute-1 sudo[126204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:39 compute-1 python3.9[126206]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162438.7094913-607-232059713954712/.source.nft follow=False _original_basename=ruleset.j2 checksum=bdba38546f86123f1927359d89789bd211aba99d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:39 compute-1 sudo[126204]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:40 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438001d40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:40 compute-1 sudo[126356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufkyficyhdhkbyjyrapbwnpfskiwusyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162440.1612785-652-75282146536474/AnsiballZ_file.py'
Jan 23 10:00:40 compute-1 sudo[126356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:40 compute-1 python3.9[126358]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:40 compute-1 sudo[126356]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:40 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:41 compute-1 sudo[126508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ynaodibvypyzvgbidgsvrgnrdtcnpimz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162440.905978-676-105963283203760/AnsiballZ_command.py'
Jan 23 10:00:41 compute-1 sudo[126508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:41.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:41 compute-1 ceph-mon[80126]: pgmap v214: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:41 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:41 compute-1 python3.9[126510]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:41.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:41 compute-1 sudo[126508]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:42 compute-1 sudo[126664]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-figskatcrlllcwncoixmylkwatutncrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162441.6855493-700-27636074839256/AnsiballZ_blockinfile.py'
Jan 23 10:00:42 compute-1 sudo[126664]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:42 compute-1 python3.9[126666]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:42 compute-1 sudo[126664]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:42 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:42 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:42 compute-1 sudo[126816]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-epkuanbvoawqsoyzfxtqzvoegslgqxll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162442.6000526-727-181693321205510/AnsiballZ_command.py'
Jan 23 10:00:42 compute-1 sudo[126816]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:43 compute-1 python3.9[126818]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:43 compute-1 sudo[126816]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:43.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:43 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:43.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:43 compute-1 ceph-mon[80126]: pgmap v215: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:43 compute-1 sudo[126970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nxfkjutgovyniafziebmxftcdczcrpgn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162443.3196387-751-72227599370341/AnsiballZ_stat.py'
Jan 23 10:00:43 compute-1 sudo[126970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:44 compute-1 python3.9[126972]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:00:44 compute-1 sudo[126970]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:44 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:44 compute-1 sudo[127124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lkkmbdmihcqbqdlbodjnuammogdhhthu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162444.2329714-775-206967877569994/AnsiballZ_command.py'
Jan 23 10:00:44 compute-1 sudo[127124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:44 compute-1 python3.9[127126]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:44 compute-1 sudo[127124]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:44 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:45 compute-1 ceph-mon[80126]: pgmap v216: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:45 compute-1 sudo[127279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lrskvcrzqciogkwamdiencnkaaevlnpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162444.9578788-799-218406976051030/AnsiballZ_file.py'
Jan 23 10:00:45 compute-1 sudo[127279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:45.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:45 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd440002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:45 compute-1 python3.9[127281]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:45.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:45 compute-1 sudo[127279]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:45 compute-1 ceph-osd[77616]: bluestore.MempoolThread fragmentation_score=0.000033 took=0.000313s
Jan 23 10:00:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:46 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:46 compute-1 python3.9[127432]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:00:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:46 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:47.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:47 compute-1 ceph-mon[80126]: pgmap v217: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:47 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c0013a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:00:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Cumulative writes: 2156 writes, 13K keys, 2156 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.06 MB/s
                                           Cumulative WAL: 2156 writes, 2156 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 2156 writes, 13K keys, 2156 commit groups, 1.0 writes per commit group, ingest: 36.35 MB, 0.06 MB/s
                                           Interval WAL: 2156 writes, 2156 syncs, 1.00 writes per sync, written: 0.04 GB, 0.06 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     38.9      0.53              0.06         6    0.088       0      0       0.0       0.0
                                             L6      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   3.0    125.0    110.6      0.55              0.18         5    0.111     22K   2298       0.0       0.0
                                            Sum      1/0   12.61 MB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     64.0     75.6      1.08              0.25        11    0.099     22K   2298       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   4.0     64.1     75.8      1.08              0.25        10    0.108     22K   2298       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   0.0    125.0    110.6      0.55              0.18         5    0.111     22K   2298       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     39.1      0.53              0.06         5    0.105       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 600.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.020, interval 0.020
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.1 seconds
                                           Interval compaction: 0.08 GB write, 0.14 MB/s write, 0.07 GB read, 0.12 MB/s read, 1.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 1.47 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000161 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(86,1.26 MB,0.414803%) FilterBlock(11,73.42 KB,0.0235859%) IndexBlock(11,142.14 KB,0.0456609%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:00:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:47.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:47 compute-1 sudo[127586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdlaujzncoxvusopotxikcmtfnbgrtmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162447.5153372-919-142659340215186/AnsiballZ_command.py'
Jan 23 10:00:47 compute-1 sudo[127586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:48 compute-1 python3.9[127588]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:8d:1d:08:09" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:48 compute-1 ovs-vsctl[127589]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-1.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:8d:1d:08:09 external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Jan 23 10:00:48 compute-1 sudo[127586]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:48 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:48 compute-1 sudo[127739]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jyhenatslkkehvfzdvijtjlenquauvsx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162448.5481334-946-166289547837644/AnsiballZ_command.py'
Jan 23 10:00:48 compute-1 sudo[127739]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:48 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:49 compute-1 python3.9[127741]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ovs-vsctl show | grep -q "Manager"
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:49 compute-1 sudo[127739]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:49.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:49 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:49.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:49 compute-1 sudo[127895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bgvlysftmmxpbisctvxofbpekgjbocyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162449.2699132-970-252530133632534/AnsiballZ_command.py'
Jan 23 10:00:49 compute-1 sudo[127895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:49 compute-1 python3.9[127897]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:00:49 compute-1 ovs-vsctl[127899]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Jan 23 10:00:49 compute-1 sudo[127895]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:49 compute-1 sudo[127898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:00:49 compute-1 sudo[127898]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:49 compute-1 sudo[127898]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:49 compute-1 sudo[127948]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:00:49 compute-1 sudo[127948]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:50 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:50 compute-1 sudo[127948]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:50 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:51.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:51 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:51.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:52 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:52 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:53.264 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:53 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:53.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:53 compute-1 ceph-mds[84630]: mds.beacon.cephfs.compute-1.bcvzvj missed beacon ack from the monitors
Jan 23 10:00:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:54 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:54 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:55.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:55 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:55.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:55 compute-1 ceph-mon[80126]: pgmap v218: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:00:55 compute-1 python3.9[128131]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:00:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:56 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c001eb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:00:56 compute-1 ceph-mon[80126]: pgmap v219: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:56 compute-1 ceph-mon[80126]: pgmap v220: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:56 compute-1 ceph-mon[80126]: pgmap v221: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:00:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:00:56 compute-1 sudo[128283]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rbnwfqelpsadffoenfjoxgnxhooymdfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162456.2714489-1021-271589035590789/AnsiballZ_file.py'
Jan 23 10:00:56 compute-1 sudo[128283]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:56 compute-1 python3.9[128285]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:56 compute-1 sudo[128283]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:56 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:00:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:57.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:00:57 compute-1 sudo[128435]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zndvlaktosyokuyaxxjhghkkffhzsfdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162457.0481977-1045-135765037521660/AnsiballZ_stat.py'
Jan 23 10:00:57 compute-1 sudo[128435]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:57 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:57.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:57 compute-1 ceph-mon[80126]: pgmap v222: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:00:57 compute-1 python3.9[128438]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:57 compute-1 sudo[128435]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:00:57 compute-1 sudo[128514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ekjyuctzpczsshngmtzojssemkldlwtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162457.0481977-1045-135765037521660/AnsiballZ_file.py'
Jan 23 10:00:57 compute-1 sudo[128514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:58 compute-1 python3.9[128516]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:58 compute-1 sudo[128514]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:58 compute-1 sudo[128568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:00:58 compute-1 sudo[128568]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:00:58 compute-1 sudo[128568]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:58 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:58 compute-1 sudo[128691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwgmvnigigvdupciizpwpoptzikelwvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162458.1879349-1045-274773277959518/AnsiballZ_stat.py'
Jan 23 10:00:58 compute-1 sudo[128691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:58 compute-1 ceph-mon[80126]: pgmap v223: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:00:58 compute-1 python3.9[128693]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:00:58 compute-1 sudo[128691]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:58 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:58 compute-1 sudo[128769]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jtkgnyqijibirrnggsxesrzyiylmzdtg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162458.1879349-1045-274773277959518/AnsiballZ_file.py'
Jan 23 10:00:58 compute-1 sudo[128769]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:59 compute-1 python3.9[128771]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:00:59 compute-1 sudo[128769]: pam_unix(sudo:session): session closed for user root
Jan 23 10:00:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:00:59.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:00:59 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:00:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:00:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:00:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:00:59.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:00:59 compute-1 sudo[128922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rgmezxkjlsebnzicjcvmxiwhwxdtyroc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162459.403726-1114-189890457626257/AnsiballZ_file.py'
Jan 23 10:00:59 compute-1 sudo[128922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:00:59 compute-1 python3.9[128924]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:00:59 compute-1 sudo[128922]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:00 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:00 compute-1 sudo[129074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tkfdxlzrjmwyjcgwjvgfilltysoojncm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162460.0953758-1138-238363669981868/AnsiballZ_stat.py'
Jan 23 10:01:00 compute-1 sudo[129074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:00 compute-1 python3.9[129076]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:00 compute-1 sudo[129074]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:00 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:00 compute-1 sudo[129152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmiktbventczekwvdmpoatgcgzebxoos ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162460.0953758-1138-238363669981868/AnsiballZ_file.py'
Jan 23 10:01:00 compute-1 sudo[129152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:01 compute-1 python3.9[129154]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:01 compute-1 sudo[129152]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:01 compute-1 ceph-mon[80126]: pgmap v224: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:01 compute-1 CROND[129156]: (root) CMD (run-parts /etc/cron.hourly)
Jan 23 10:01:01 compute-1 run-parts[129159]: (/etc/cron.hourly) starting 0anacron
Jan 23 10:01:01 compute-1 run-parts[129165]: (/etc/cron.hourly) finished 0anacron
Jan 23 10:01:01 compute-1 CROND[129155]: (root) CMDEND (run-parts /etc/cron.hourly)
Jan 23 10:01:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:01 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:01.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:01 compute-1 sudo[129316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wohwcpuzbwqtmkdurtzayvbesagfnnxd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162461.5651305-1174-252896891114414/AnsiballZ_stat.py'
Jan 23 10:01:01 compute-1 sudo[129316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:02 compute-1 python3.9[129318]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:02 compute-1 sudo[129316]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:02 compute-1 sudo[129394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxbkgypweutzhietrlftjcxzpddboxot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162461.5651305-1174-252896891114414/AnsiballZ_file.py'
Jan 23 10:01:02 compute-1 sudo[129394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:02 compute-1 python3.9[129396]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:02 compute-1 sudo[129394]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:02 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:02 compute-1 ceph-mon[80126]: pgmap v225: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:03 compute-1 sudo[129546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xvwgolfrbxcrkooznqlzdzfqqzevumtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162462.8066359-1210-60843685431198/AnsiballZ_systemd.py'
Jan 23 10:01:03 compute-1 sudo[129546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:03.282 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:03 compute-1 python3.9[129548]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:01:03 compute-1 systemd[1]: Reloading.
Jan 23 10:01:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:03 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:03.467 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:03 compute-1 systemd-rc-local-generator[129581]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:03 compute-1 systemd-sysv-generator[129585]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:03 compute-1 sudo[129546]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:04 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:04 compute-1 sudo[129688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:01:04 compute-1 sudo[129688]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:04 compute-1 sudo[129688]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:04 compute-1 sudo[129763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lwveslmxeqkujjhcemfgudxcuejwdxvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162464.4695606-1234-31302388234212/AnsiballZ_stat.py'
Jan 23 10:01:04 compute-1 sudo[129763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:04 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:04 compute-1 python3.9[129765]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:05 compute-1 sudo[129763]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:05 compute-1 sudo[129841]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaygawzxawpalesbxjrrbnxrtnlbyrbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162464.4695606-1234-31302388234212/AnsiballZ_file.py'
Jan 23 10:01:05 compute-1 sudo[129841]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:05.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:05 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:05.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:05 compute-1 ceph-mon[80126]: pgmap v226: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:01:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:01:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:05 compute-1 python3.9[129843]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:05 compute-1 sudo[129841]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:06 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:06 compute-1 sudo[129994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uixvfvppenovwzimjivthvnkygnjvraw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162466.1622903-1270-162649847465136/AnsiballZ_stat.py'
Jan 23 10:01:06 compute-1 sudo[129994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:06 compute-1 ceph-mon[80126]: pgmap v227: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:06 compute-1 python3.9[129996]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:06 compute-1 sudo[129994]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:06 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:06 compute-1 sudo[130072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-onajgmqfdesebqoiswheuiqjpauklbtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162466.1622903-1270-162649847465136/AnsiballZ_file.py'
Jan 23 10:01:06 compute-1 sudo[130072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:07 compute-1 python3.9[130074]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:07 compute-1 sudo[130072]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:07.289 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:07 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44800a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:07.473 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:07 compute-1 sudo[130225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fibpbxylhwtyzjkaqccrgfbqcmngtwac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162467.342449-1306-75811744790075/AnsiballZ_systemd.py'
Jan 23 10:01:07 compute-1 sudo[130225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:07 compute-1 python3.9[130227]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:01:07 compute-1 systemd[1]: Reloading.
Jan 23 10:01:08 compute-1 systemd-sysv-generator[130257]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:08 compute-1 systemd-rc-local-generator[130254]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:08 compute-1 systemd[1]: Starting Create netns directory...
Jan 23 10:01:08 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 10:01:08 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 10:01:08 compute-1 systemd[1]: Finished Create netns directory.
Jan 23 10:01:08 compute-1 sudo[130225]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:08 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:09 compute-1 sudo[130419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cxagosxmrfyiegvgtygngwgzienosqgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162468.816853-1336-28871860003272/AnsiballZ_file.py'
Jan 23 10:01:09 compute-1 sudo[130419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:09.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:09 compute-1 python3.9[130421]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:09 compute-1 sudo[130419]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:09 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:09.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:09 compute-1 ceph-mon[80126]: pgmap v228: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:01:10 compute-1 sudo[130572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gtobhovthtvslrwlxyvyhrvuxftbkdac ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162469.891899-1360-200251991842722/AnsiballZ_stat.py'
Jan 23 10:01:10 compute-1 sudo[130572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:10 compute-1 python3.9[130574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:10 compute-1 sudo[130572]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:10 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:10 compute-1 ceph-mon[80126]: pgmap v229: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:10 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:10 compute-1 sudo[130696]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtluzdstrlpekunwnpqnchkijguynxzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162469.891899-1360-200251991842722/AnsiballZ_copy.py'
Jan 23 10:01:10 compute-1 sudo[130696]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:11 compute-1 python3.9[130698]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162469.891899-1360-200251991842722/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:11 compute-1 sudo[130696]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:11.296 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:11 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd424000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:11.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:11 compute-1 sudo[130849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zlqosofrrqyxduazdoskmmsqweaglqiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162471.6374297-1412-254959692362859/AnsiballZ_file.py'
Jan 23 10:01:11 compute-1 sudo[130849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:12 compute-1 python3.9[130851]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:12 compute-1 sudo[130849]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:12 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:12 compute-1 sudo[131001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pdkquduphvcieivgdxnyjwpvgppvbmpv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162472.3475428-1435-140985487133704/AnsiballZ_file.py'
Jan 23 10:01:12 compute-1 sudo[131001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:12 compute-1 python3.9[131003]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:12 compute-1 sudo[131001]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:12 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd44c004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:13.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:13 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:13 compute-1 sudo[131154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hjzxosyfvymachdeimjrxrlzagzkjmjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162473.1772597-1459-219667465065434/AnsiballZ_stat.py'
Jan 23 10:01:13 compute-1 sudo[131154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:13.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:13 compute-1 ceph-mon[80126]: pgmap v230: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:13 compute-1 python3.9[131156]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:13 compute-1 sudo[131154]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:14 compute-1 sudo[131277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scszlazznlwfouxujlntibiovoycjmnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162473.1772597-1459-219667465065434/AnsiballZ_copy.py'
Jan 23 10:01:14 compute-1 sudo[131277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:14 compute-1 python3.9[131279]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162473.1772597-1459-219667465065434/.source.json _original_basename=.3iy7p8yy follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:14 compute-1 sudo[131277]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:14 compute-1 ceph-mon[80126]: pgmap v231: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:14 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:14 compute-1 python3.9[131429]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:15.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:15 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:15.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd438003a10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:16 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd4240016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:17 compute-1 ceph-mon[80126]: pgmap v232: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:17.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:17 compute-1 sudo[131851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwmwzgbexhuvoyngvlwapqqwpoouufmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162476.8912537-1579-57360102719021/AnsiballZ_container_config_data.py'
Jan 23 10:01:17 compute-1 sudo[131851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:17 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:17.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:17 compute-1 python3.9[131853]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Jan 23 10:01:17 compute-1 sudo[131851]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:18 compute-1 sudo[131954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:01:18 compute-1 sudo[131954]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:18 compute-1 sudo[131954]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:18 compute-1 kernel: ganesha.nfsd[122702]: segfault at 50 ip 00007fd4d441432e sp 00007fd437ffe210 error 4 in libntirpc.so.5.8[7fd4d43f9000+2c000] likely on CPU 3 (core 0, socket 3)
Jan 23 10:01:18 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:01:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[121562]: 23/01/2026 10:01:18 : epoch 697346a2 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fd42c003db0 fd 39 proxy ignored for local
Jan 23 10:01:18 compute-1 sudo[132029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hifegakajpiwkhttanedahlrbbezsvqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162477.9844294-1612-65624830178356/AnsiballZ_container_config_hash.py'
Jan 23 10:01:18 compute-1 sudo[132029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:18 compute-1 systemd[1]: Started Process Core Dump (PID 132031/UID 0).
Jan 23 10:01:18 compute-1 python3.9[132032]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:01:18 compute-1 sudo[132029]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:19.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:19.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:19 compute-1 systemd-coredump[132033]: Process 121566 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 54:
                                                    #0  0x00007fd4d441432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007fd4d441e900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:01:19 compute-1 systemd[1]: systemd-coredump@4-132031-0.service: Deactivated successfully.
Jan 23 10:01:19 compute-1 systemd[1]: systemd-coredump@4-132031-0.service: Consumed 1.256s CPU time.
Jan 23 10:01:19 compute-1 podman[132151]: 2026-01-23 10:01:19.876807411 +0000 UTC m=+0.052085632 container died 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.40.1, OSD_FLAVOR=default)
Jan 23 10:01:19 compute-1 sudo[132199]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jspeaewcseovztsoogqyzhfswumpnrqe ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162479.2705708-1642-58085836629404/AnsiballZ_edpm_container_manage.py'
Jan 23 10:01:19 compute-1 sudo[132199]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:19 compute-1 systemd[1]: var-lib-containers-storage-overlay-916d97ef95f49e0b264fd23d2500bc12f9fe3eb7ffc0aeaeaa0b9e29bbe55e9f-merged.mount: Deactivated successfully.
Jan 23 10:01:19 compute-1 podman[132151]: 2026-01-23 10:01:19.920207198 +0000 UTC m=+0.095485399 container remove 44cfab1fbb8dfbc712e8cb29734e6fc390c2c24ea4ac763ca054605132305141 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0)
Jan 23 10:01:19 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:01:19 compute-1 ceph-mon[80126]: pgmap v233: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:01:20 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:01:20 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.798s CPU time.
Jan 23 10:01:20 compute-1 python3[132201]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:01:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:21.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:21 compute-1 ceph-mon[80126]: pgmap v234: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:21.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:22 compute-1 ceph-mon[80126]: pgmap v235: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:23.318 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:23.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100124 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:01:24 compute-1 ceph-mon[80126]: pgmap v236: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:25 compute-1 podman[132247]: 2026-01-23 10:01:25.049347082 +0000 UTC m=+4.817359095 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 10:01:25 compute-1 podman[132371]: 2026-01-23 10:01:25.211817725 +0000 UTC m=+0.058488966 container create 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:01:25 compute-1 podman[132371]: 2026-01-23 10:01:25.179732147 +0000 UTC m=+0.026403468 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 10:01:25 compute-1 python3[132201]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} 
--log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de
Jan 23 10:01:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:25.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:25 compute-1 sudo[132199]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:25.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:27 compute-1 ceph-mon[80126]: pgmap v237: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:01:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:27.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:27.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:29.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:29 compute-1 sudo[132562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uzmndqlmkwsjberqjoxttahtnqevbmft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162489.0791655-1666-206185604796202/AnsiballZ_stat.py'
Jan 23 10:01:29 compute-1 sudo[132562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:29.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:29 compute-1 python3.9[132564]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:01:29 compute-1 sudo[132562]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:29 compute-1 ceph-mon[80126]: pgmap v238: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:01:30 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 5.
Jan 23 10:01:30 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:01:30 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.798s CPU time.
Jan 23 10:01:30 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:01:30 compute-1 sudo[132729]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lclnimbtzzdywbonjfheaqplhzyrjxtb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.0122082-1693-178078190709328/AnsiballZ_file.py'
Jan 23 10:01:30 compute-1 sudo[132729]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:30 compute-1 podman[132765]: 2026-01-23 10:01:30.470653972 +0000 UTC m=+0.064323512 container create 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2)
Jan 23 10:01:30 compute-1 python3.9[132732]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:30 compute-1 sudo[132729]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:30 compute-1 podman[132765]: 2026-01-23 10:01:30.435531968 +0000 UTC m=+0.029201508 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:01:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:01:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:01:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:01:30 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:01:30 compute-1 podman[132765]: 2026-01-23 10:01:30.551416024 +0000 UTC m=+0.145085564 container init 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:01:30 compute-1 podman[132765]: 2026-01-23 10:01:30.558134756 +0000 UTC m=+0.151804266 container start 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325)
Jan 23 10:01:30 compute-1 bash[132765]: 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:01:30 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:01:30 compute-1 sudo[132895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ywyfmechobixtnhodslxymriwsporsom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.0122082-1693-178078190709328/AnsiballZ_stat.py'
Jan 23 10:01:30 compute-1 sudo[132895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:01:30 compute-1 python3.9[132897]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:01:30 compute-1 sudo[132895]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:30 compute-1 ceph-mon[80126]: pgmap v239: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:01:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:31.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:31 compute-1 sudo[133047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lambvlnzgodbaydorcafsfrxzchlybcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.983635-1693-194799596292884/AnsiballZ_copy.py'
Jan 23 10:01:31 compute-1 sudo[133047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:31.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:31 compute-1 python3.9[133049]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162490.983635-1693-194799596292884/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:31 compute-1 sudo[133047]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:31 compute-1 sudo[133123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afldyzfzugsedicezmpgxgvutejcvuzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.983635-1693-194799596292884/AnsiballZ_systemd.py'
Jan 23 10:01:31 compute-1 sudo[133123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:32 compute-1 python3.9[133125]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:01:32 compute-1 systemd[1]: Reloading.
Jan 23 10:01:32 compute-1 systemd-rc-local-generator[133150]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:32 compute-1 systemd-sysv-generator[133156]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:32 compute-1 sudo[133123]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:32 compute-1 sudo[133233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tzhrmrkxzsnlarywesqnzaxgfquwlwkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162490.983635-1693-194799596292884/AnsiballZ_systemd.py'
Jan 23 10:01:32 compute-1 sudo[133233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:33 compute-1 python3.9[133235]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:01:33 compute-1 systemd[1]: Reloading.
Jan 23 10:01:33 compute-1 systemd-sysv-generator[133270]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:33 compute-1 systemd-rc-local-generator[133266]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:33.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:33 compute-1 systemd[1]: Starting ovn_controller container...
Jan 23 10:01:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:33.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:33 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:01:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/43da7c3110500b2184bcac9d202c790e74899d463bae53ae641d21dda7a79896/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Jan 23 10:01:33 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162.
Jan 23 10:01:33 compute-1 podman[133278]: 2026-01-23 10:01:33.622424859 +0000 UTC m=+0.136637926 container init 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:01:33 compute-1 ovn_controller[133293]: + sudo -E kolla_set_configs
Jan 23 10:01:33 compute-1 podman[133278]: 2026-01-23 10:01:33.650281122 +0000 UTC m=+0.164494179 container start 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:01:33 compute-1 systemd[1]: Created slice User Slice of UID 0.
Jan 23 10:01:33 compute-1 systemd[1]: Starting User Runtime Directory /run/user/0...
Jan 23 10:01:33 compute-1 edpm-start-podman-container[133278]: ovn_controller
Jan 23 10:01:33 compute-1 systemd[1]: Finished User Runtime Directory /run/user/0.
Jan 23 10:01:33 compute-1 systemd[1]: Starting User Manager for UID 0...
Jan 23 10:01:33 compute-1 systemd[133317]: pam_unix(systemd-user:session): session opened for user root(uid=0) by root(uid=0)
Jan 23 10:01:33 compute-1 edpm-start-podman-container[133277]: Creating additional drop-in dependency for "ovn_controller" (56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162)
Jan 23 10:01:33 compute-1 systemd[1]: Reloading.
Jan 23 10:01:33 compute-1 podman[133300]: 2026-01-23 10:01:33.783458366 +0000 UTC m=+0.123675004 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true)
Jan 23 10:01:33 compute-1 systemd[133317]: Queued start job for default target Main User Target.
Jan 23 10:01:33 compute-1 systemd[133317]: Created slice User Application Slice.
Jan 23 10:01:33 compute-1 systemd[133317]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Jan 23 10:01:33 compute-1 systemd[133317]: Started Daily Cleanup of User's Temporary Directories.
Jan 23 10:01:33 compute-1 systemd[133317]: Reached target Paths.
Jan 23 10:01:33 compute-1 systemd[133317]: Reached target Timers.
Jan 23 10:01:33 compute-1 systemd[133317]: Starting D-Bus User Message Bus Socket...
Jan 23 10:01:33 compute-1 systemd[133317]: Starting Create User's Volatile Files and Directories...
Jan 23 10:01:33 compute-1 systemd[133317]: Finished Create User's Volatile Files and Directories.
Jan 23 10:01:33 compute-1 systemd[133317]: Listening on D-Bus User Message Bus Socket.
Jan 23 10:01:33 compute-1 systemd[133317]: Reached target Sockets.
Jan 23 10:01:33 compute-1 systemd[133317]: Reached target Basic System.
Jan 23 10:01:33 compute-1 systemd[133317]: Reached target Main User Target.
Jan 23 10:01:33 compute-1 systemd[133317]: Startup finished in 135ms.
Jan 23 10:01:33 compute-1 systemd-rc-local-generator[133380]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:01:33 compute-1 systemd-sysv-generator[133384]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:01:34 compute-1 systemd[1]: Started User Manager for UID 0.
Jan 23 10:01:34 compute-1 systemd[1]: Started ovn_controller container.
Jan 23 10:01:34 compute-1 systemd[1]: 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162-7ccd2d264f17c79c.service: Main process exited, code=exited, status=1/FAILURE
Jan 23 10:01:34 compute-1 systemd[1]: 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162-7ccd2d264f17c79c.service: Failed with result 'exit-code'.
Jan 23 10:01:34 compute-1 systemd[1]: Started Session c1 of User root.
Jan 23 10:01:34 compute-1 sudo[133233]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:34 compute-1 ovn_controller[133293]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:01:34 compute-1 ovn_controller[133293]: INFO:__main__:Validating config file
Jan 23 10:01:34 compute-1 ovn_controller[133293]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:01:34 compute-1 ovn_controller[133293]: INFO:__main__:Writing out command to execute
Jan 23 10:01:34 compute-1 systemd[1]: session-c1.scope: Deactivated successfully.
Jan 23 10:01:34 compute-1 ovn_controller[133293]: ++ cat /run_command
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + ARGS=
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + sudo kolla_copy_cacerts
Jan 23 10:01:34 compute-1 systemd[1]: Started Session c2 of User root.
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + [[ ! -n '' ]]
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + . kolla_extend_start
Jan 23 10:01:34 compute-1 ovn_controller[133293]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + umask 0022
Jan 23 10:01:34 compute-1 ovn_controller[133293]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Jan 23 10:01:34 compute-1 systemd[1]: session-c2.scope: Deactivated successfully.
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Jan 23 10:01:34 compute-1 NetworkManager[48978]: <info>  [1769162494.2257] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/16)
Jan 23 10:01:34 compute-1 NetworkManager[48978]: <info>  [1769162494.2264] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 10:01:34 compute-1 NetworkManager[48978]: <warn>  [1769162494.2267] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 10:01:34 compute-1 NetworkManager[48978]: <info>  [1769162494.2274] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Jan 23 10:01:34 compute-1 NetworkManager[48978]: <info>  [1769162494.2279] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/18)
Jan 23 10:01:34 compute-1 NetworkManager[48978]: <info>  [1769162494.2282] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 10:01:34 compute-1 kernel: br-int: entered promiscuous mode
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00014|main|INFO|OVS feature set changed, force recompute.
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00020|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00021|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00022|main|INFO|OVS feature set changed, force recompute.
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00023|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Jan 23 10:01:34 compute-1 ovn_controller[133293]: 2026-01-23T10:01:34Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Jan 23 10:01:34 compute-1 systemd-udevd[133424]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:01:34 compute-1 ceph-mon[80126]: pgmap v240: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:01:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:35.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:35.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:35 compute-1 ovn_controller[133293]: 2026-01-23T10:01:35Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:35 compute-1 ovn_controller[133293]: 2026-01-23T10:01:35Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Jan 23 10:01:35 compute-1 ovn_controller[133293]: 2026-01-23T10:01:35Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:35 compute-1 ovn_controller[133293]: 2026-01-23T10:01:35Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Jan 23 10:01:35 compute-1 ovn_controller[133293]: 2026-01-23T10:01:35Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:35 compute-1 ovn_controller[133293]: 2026-01-23T10:01:35Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Jan 23 10:01:35 compute-1 NetworkManager[48978]: <info>  [1769162495.6945] manager: (ovn-eb059b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Jan 23 10:01:35 compute-1 NetworkManager[48978]: <info>  [1769162495.6999] manager: (ovn-57e418-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Jan 23 10:01:35 compute-1 systemd-udevd[133426]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:01:35 compute-1 kernel: genev_sys_6081: entered promiscuous mode
Jan 23 10:01:35 compute-1 NetworkManager[48978]: <info>  [1769162495.7152] device (genev_sys_6081): carrier: link connected
Jan 23 10:01:35 compute-1 NetworkManager[48978]: <info>  [1769162495.7157] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/21)
Jan 23 10:01:36 compute-1 ceph-mon[80126]: pgmap v241: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:01:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:36 compute-1 NetworkManager[48978]: <info>  [1769162496.1838] manager: (ovn-8fb585-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Jan 23 10:01:37 compute-1 python3.9[133557]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 10:01:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:37.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:37 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:01:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:37 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:01:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:37.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:38 compute-1 ceph-mon[80126]: pgmap v242: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:01:38 compute-1 sudo[133708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvuriqjyztlaqvhywvyysyeehleywxch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162497.756444-1828-93450145547982/AnsiballZ_stat.py'
Jan 23 10:01:38 compute-1 sudo[133708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:38 compute-1 sudo[133711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:01:38 compute-1 python3.9[133710]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:38 compute-1 sudo[133711]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:38 compute-1 sudo[133711]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:38 compute-1 sudo[133708]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:38 compute-1 sudo[133856]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mvykncfdlqtbixmaqxrhwktfvyjsptdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162497.756444-1828-93450145547982/AnsiballZ_copy.py'
Jan 23 10:01:38 compute-1 sudo[133856]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:38 compute-1 python3.9[133858]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162497.756444-1828-93450145547982/.source.yaml _original_basename=.9dol0os0 follow=False checksum=a80724acad465d51ee59522dfe4a3a5c05876d7d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:01:38 compute-1 sudo[133856]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:01:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:39.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:01:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:39.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:39 compute-1 ceph-mon[80126]: pgmap v243: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:01:39 compute-1 sudo[134009]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vzbtefchwdcbejaojjtwddbvmvhfknlj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162499.3933847-1873-19974983787308/AnsiballZ_command.py'
Jan 23 10:01:39 compute-1 sudo[134009]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:39 compute-1 python3.9[134011]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:01:39 compute-1 ovs-vsctl[134012]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Jan 23 10:01:39 compute-1 sudo[134009]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:40 compute-1 sudo[134162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mszuopvziemeiqldrvarsumgszwhgaml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162500.1033833-1897-132826317207028/AnsiballZ_command.py'
Jan 23 10:01:40 compute-1 sudo[134162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:40 compute-1 python3.9[134164]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g' _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:01:40 compute-1 ovs-vsctl[134166]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Jan 23 10:01:40 compute-1 sudo[134162]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:41.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:41 compute-1 sudo[134318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxpaoykjfnapennwysjbxwiuvrhxozob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162501.1882293-1939-226789643414095/AnsiballZ_command.py'
Jan 23 10:01:41 compute-1 sudo[134318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:41 compute-1 ceph-mon[80126]: pgmap v244: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:01:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:41.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:41 compute-1 python3.9[134320]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:01:41 compute-1 ovs-vsctl[134321]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Jan 23 10:01:41 compute-1 sudo[134318]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:42 compute-1 sshd-session[122286]: Connection closed by 192.168.122.30 port 46160
Jan 23 10:01:42 compute-1 sshd-session[122283]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:01:42 compute-1 systemd[1]: session-49.scope: Deactivated successfully.
Jan 23 10:01:42 compute-1 systemd[1]: session-49.scope: Consumed 59.553s CPU time.
Jan 23 10:01:42 compute-1 systemd-logind[807]: Session 49 logged out. Waiting for processes to exit.
Jan 23 10:01:42 compute-1 systemd-logind[807]: Removed session 49.
Jan 23 10:01:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:43.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:43.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:43 compute-1 ceph-mon[80126]: pgmap v245: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:01:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:43 : epoch 697346fa : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:01:44 compute-1 systemd[1]: Stopping User Manager for UID 0...
Jan 23 10:01:44 compute-1 systemd[133317]: Activating special unit Exit the Session...
Jan 23 10:01:44 compute-1 systemd[133317]: Stopped target Main User Target.
Jan 23 10:01:44 compute-1 systemd[133317]: Stopped target Basic System.
Jan 23 10:01:44 compute-1 systemd[133317]: Stopped target Paths.
Jan 23 10:01:44 compute-1 systemd[133317]: Stopped target Sockets.
Jan 23 10:01:44 compute-1 systemd[133317]: Stopped target Timers.
Jan 23 10:01:44 compute-1 systemd[133317]: Stopped Daily Cleanup of User's Temporary Directories.
Jan 23 10:01:44 compute-1 systemd[133317]: Closed D-Bus User Message Bus Socket.
Jan 23 10:01:44 compute-1 systemd[133317]: Stopped Create User's Volatile Files and Directories.
Jan 23 10:01:44 compute-1 systemd[133317]: Removed slice User Application Slice.
Jan 23 10:01:44 compute-1 systemd[133317]: Reached target Shutdown.
Jan 23 10:01:44 compute-1 systemd[133317]: Finished Exit the Session.
Jan 23 10:01:44 compute-1 systemd[133317]: Reached target Exit the Session.
Jan 23 10:01:44 compute-1 systemd[1]: user@0.service: Deactivated successfully.
Jan 23 10:01:44 compute-1 systemd[1]: Stopped User Manager for UID 0.
Jan 23 10:01:44 compute-1 systemd[1]: Stopping User Runtime Directory /run/user/0...
Jan 23 10:01:44 compute-1 systemd[1]: run-user-0.mount: Deactivated successfully.
Jan 23 10:01:44 compute-1 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Jan 23 10:01:44 compute-1 systemd[1]: Stopped User Runtime Directory /run/user/0.
Jan 23 10:01:44 compute-1 systemd[1]: Removed slice User Slice of UID 0.
Jan 23 10:01:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:44 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9360000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:44 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:44 compute-1 ceph-mon[80126]: pgmap v246: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:01:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:45.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:45 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:45.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100146 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:01:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:46 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:46 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:47.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:47 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:47.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:47 compute-1 ceph-mon[80126]: pgmap v247: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:01:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:48 compute-1 sshd-session[134367]: Accepted publickey for zuul from 192.168.122.30 port 48480 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:01:48 compute-1 systemd-logind[807]: New session 51 of user zuul.
Jan 23 10:01:48 compute-1 systemd[1]: Started Session 51 of User zuul.
Jan 23 10:01:48 compute-1 sshd-session[134367]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:01:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:48 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:48 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:49 compute-1 ceph-mon[80126]: pgmap v248: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 852 B/s wr, 3 op/s
Jan 23 10:01:49 compute-1 python3.9[134520]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:01:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:01:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:49.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:01:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:49 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:01:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:49.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:01:50 compute-1 sudo[134675]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efalldefgvomobynraqrqofhwmzygkxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162509.8932712-58-133020893129925/AnsiballZ_file.py'
Jan 23 10:01:50 compute-1 sudo[134675]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:01:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:50 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93640023e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:50 compute-1 python3.9[134677]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:50 compute-1 sudo[134675]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:50 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:50 compute-1 sudo[134827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crrmpixcsgfyeckqhykijmcipieenrnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162510.7023144-58-60730772464336/AnsiballZ_file.py'
Jan 23 10:01:50 compute-1 sudo[134827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:51 compute-1 python3.9[134829]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:51 compute-1 sudo[134827]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:51.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:51 compute-1 ceph-mon[80126]: pgmap v249: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 10:01:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:51 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:51.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:51 compute-1 sudo[134980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hpehirxasukqwlyrzgbtxeriavrhsvcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162511.4693484-58-75853558032340/AnsiballZ_file.py'
Jan 23 10:01:51 compute-1 sudo[134980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:52 compute-1 python3.9[134982]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:52 compute-1 sudo[134980]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:52 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:52 compute-1 sudo[135133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ofsqlwitbfvlakxkxehsbotdmkjykmsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162512.2985446-58-37808751602371/AnsiballZ_file.py'
Jan 23 10:01:52 compute-1 sudo[135133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:52 compute-1 python3.9[135135]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:52 compute-1 sudo[135133]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:52 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:53 compute-1 sudo[135285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rhdcdewvpkqugxahsmybopzbnmkefdyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162512.9746516-58-102875581747846/AnsiballZ_file.py'
Jan 23 10:01:53 compute-1 sudo[135285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:01:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:53.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:01:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:53 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9358001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:53.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:53 compute-1 python3.9[135287]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:53 compute-1 sudo[135285]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:54 compute-1 ceph-mon[80126]: pgmap v250: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 425 B/s wr, 1 op/s
Jan 23 10:01:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:54 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:54 compute-1 python3.9[135438]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:01:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:54 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:55 compute-1 sudo[135588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sqtasddtfcfkdewimxsfgkrenjasoosr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162514.7564707-190-194302699995293/AnsiballZ_seboolean.py'
Jan 23 10:01:55 compute-1 sudo[135588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:01:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:55.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:55 compute-1 python3.9[135590]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Jan 23 10:01:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:55 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:55.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:56 compute-1 sudo[135588]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:56 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:56 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:56 compute-1 python3.9[135741]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:57.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:57 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:57.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:57 compute-1 python3.9[135863]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162516.3145523-214-124497351997352/.source follow=False _original_basename=haproxy.j2 checksum=1daf285be4abb25cbd7ba376734de140aac9aefe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:01:57 compute-1 ceph-mon[80126]: pgmap v251: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:01:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:01:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:58 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:58 compute-1 sudo[135888]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:01:58 compute-1 sudo[135888]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:01:58 compute-1 sudo[135888]: pam_unix(sudo:session): session closed for user root
Jan 23 10:01:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:58 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:59 compute-1 python3.9[136043]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:01:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:01:59.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:01:59 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:01:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:01:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:01:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:01:59.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:01:59 compute-1 python3.9[136166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162518.60924-259-248567993364976/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:00 compute-1 ceph-mon[80126]: pgmap v252: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:02:00 compute-1 ceph-mon[80126]: pgmap v253: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 340 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:02:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:00 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:00 compute-1 sudo[136316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kodxrioghmbwnlkvgbsljwrjmolrtojg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162520.474974-310-120243605101681/AnsiballZ_setup.py'
Jan 23 10:02:00 compute-1 sudo[136316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:00 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364002d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:01 compute-1 python3.9[136318]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:02:01 compute-1 sudo[136316]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:01.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:01 compute-1 ceph-mon[80126]: pgmap v254: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:01 compute-1 sshd-session[136319]: Invalid user sol from 45.148.10.240 port 60274
Jan 23 10:02:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:01 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:01.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:01 compute-1 sshd-session[136319]: Connection closed by invalid user sol 45.148.10.240 port 60274 [preauth]
Jan 23 10:02:01 compute-1 sudo[136403]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxkwctlcgqvvngzwdlbwtdrktdrirayb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162520.474974-310-120243605101681/AnsiballZ_dnf.py'
Jan 23 10:02:01 compute-1 sudo[136403]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:01 compute-1 python3.9[136405]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:02:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:02 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:02 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100203 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:02:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:03.412 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:03 compute-1 sudo[136403]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:03 compute-1 ceph-mon[80126]: pgmap v255: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:03 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:03.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:04 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:04 compute-1 ceph-mon[80126]: pgmap v256: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:04 compute-1 ovn_controller[133293]: 2026-01-23T10:02:04Z|00025|memory|INFO|16000 kB peak resident set size after 30.5 seconds
Jan 23 10:02:04 compute-1 ovn_controller[133293]: 2026-01-23T10:02:04Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:642 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Jan 23 10:02:04 compute-1 sudo[136581]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rpergxdtlntuiicrnjibgbcvlnxfsmye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162524.1016848-346-114341046074683/AnsiballZ_systemd.py'
Jan 23 10:02:04 compute-1 sudo[136581]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:04 compute-1 podman[136508]: 2026-01-23 10:02:04.720809817 +0000 UTC m=+0.113958898 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller)
Jan 23 10:02:04 compute-1 sudo[136588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:02:04 compute-1 sudo[136588]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:04 compute-1 sudo[136588]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:04 compute-1 sudo[136613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:02:04 compute-1 sudo[136613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:04 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:05 compute-1 python3.9[136586]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:02:05 compute-1 sudo[136581]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:05.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:05 compute-1 sudo[136613]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:05 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:05.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:02:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:02:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:05 compute-1 python3.9[136823]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:06 compute-1 python3.9[136944]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162525.3221169-370-137297292832377/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:06 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:06 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:06 compute-1 python3.9[137094]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:02:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:02:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:02:07 compute-1 ceph-mon[80126]: pgmap v257: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:07.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:07 compute-1 python3.9[137215]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162526.517756-370-15219577684183/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:07 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:07.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:08 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:08 compute-1 python3.9[137366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:08 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:09 compute-1 ceph-mon[80126]: pgmap v258: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:02:09 compute-1 python3.9[137487]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162528.3279688-502-172779167216476/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:09.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:09 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:09.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:09 compute-1 python3.9[137638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:10 compute-1 python3.9[137759]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162529.431357-502-35243825754108/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:10 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:10 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:11 compute-1 python3.9[137909]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:02:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:11.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:11 compute-1 ceph-mon[80126]: pgmap v259: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:02:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:11 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:02:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:11.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:02:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:11 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:02:11 compute-1 sudo[138062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyifrzxzbmcwtdypcyzphovvzilrywre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162531.5426574-616-120448937158580/AnsiballZ_file.py'
Jan 23 10:02:11 compute-1 sudo[138062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:12 compute-1 python3.9[138064]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:12 compute-1 sudo[138062]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:12 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:12 compute-1 sudo[138214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-evroweciokgvrhtseamjhpmiumjqilry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162532.2585294-640-69237366669075/AnsiballZ_stat.py'
Jan 23 10:02:12 compute-1 sudo[138214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:12 compute-1 python3.9[138216]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:12 compute-1 sudo[138214]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:12 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:12 compute-1 sudo[138292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udhpqellvhvzqgrlfocpieomkwezxuoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162532.2585294-640-69237366669075/AnsiballZ_file.py'
Jan 23 10:02:13 compute-1 sudo[138292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:13 compute-1 python3.9[138294]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:13 compute-1 sudo[138292]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:13 compute-1 sudo[138316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:02:13 compute-1 sudo[138316]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:13 compute-1 sudo[138316]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:13.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:13 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:13 compute-1 ceph-mon[80126]: pgmap v260: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:13 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:13 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:02:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:13.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:13 compute-1 sudo[138470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oijpcbodhmaqtmhpzdzhavbddrsuqtec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162533.354254-640-27323747454233/AnsiballZ_stat.py'
Jan 23 10:02:13 compute-1 sudo[138470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:13 compute-1 python3.9[138472]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:13 compute-1 sudo[138470]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:14 compute-1 sudo[138548]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-scyhixadnyafoodyhcpebsitiauvhffm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162533.354254-640-27323747454233/AnsiballZ_file.py'
Jan 23 10:02:14 compute-1 sudo[138548]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:14 compute-1 python3.9[138550]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:14 compute-1 sudo[138548]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9364004830 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:14 compute-1 ceph-mon[80126]: pgmap v261: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:14 compute-1 sudo[138700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvlysacdrfmcsftiuucrcdcberlezbtt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162534.4562209-709-182494450091040/AnsiballZ_file.py'
Jan 23 10:02:14 compute-1 sudo[138700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:02:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:02:14 compute-1 python3.9[138702]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:14 compute-1 sudo[138700]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:14 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f933c003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:15.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:15 compute-1 sudo[138853]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oeajnfekulmztmmxqboytpcsyvwxbgfm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162535.1866286-733-143484222924941/AnsiballZ_stat.py'
Jan 23 10:02:15 compute-1 sudo[138853]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:15 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:15.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:15 compute-1 python3.9[138855]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:15 compute-1 sudo[138853]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:15 compute-1 sudo[138932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnfyikudaykpwfotnmwzxdntkzkxuyib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162535.1866286-733-143484222924941/AnsiballZ_file.py'
Jan 23 10:02:15 compute-1 sudo[138932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:16 compute-1 python3.9[138934]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:16 compute-1 sudo[138932]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:16 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:16 compute-1 sudo[139085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dehmllmvkjhznspjkoogvpiblkppmtyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162536.3221989-769-13553743535341/AnsiballZ_stat.py'
Jan 23 10:02:16 compute-1 sudo[139085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:16 compute-1 python3.9[139087]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:16 compute-1 sudo[139085]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:16 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9350000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:17 compute-1 ceph-mon[80126]: pgmap v262: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:02:17 compute-1 sudo[139163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hgxkkgizgqysazhrdxqflcuqcpggfdic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162536.3221989-769-13553743535341/AnsiballZ_file.py'
Jan 23 10:02:17 compute-1 sudo[139163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:17 compute-1 python3.9[139165]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:17 compute-1 sudo[139163]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:17.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:17 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9340000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:17.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:17 compute-1 sudo[139316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oemnupvxpnuvgiskdrbrkkudqnfokvsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162537.493685-805-220637655288915/AnsiballZ_systemd.py'
Jan 23 10:02:17 compute-1 sudo[139316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:17 : epoch 697346fa : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:02:18 compute-1 python3.9[139318]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:02:18 compute-1 systemd[1]: Reloading.
Jan 23 10:02:18 compute-1 systemd-sysv-generator[139350]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:18 compute-1 systemd-rc-local-generator[139345]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:18 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:18 compute-1 sudo[139316]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:18 compute-1 sudo[139357]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:02:18 compute-1 sudo[139357]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:18 compute-1 sudo[139357]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:18 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:19 compute-1 sudo[139531]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vvlenuspalxukxffibebstcwdjvphrqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162538.7606199-829-38922441820431/AnsiballZ_stat.py'
Jan 23 10:02:19 compute-1 sudo[139531]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:19 compute-1 python3.9[139533]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:19 compute-1 sudo[139531]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:19 compute-1 ceph-mon[80126]: pgmap v263: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:02:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:19.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:19 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9350001aa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:19 compute-1 sudo[139610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-txtumbwpbxmdllrvcvtovqctvzpegocw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162538.7606199-829-38922441820431/AnsiballZ_file.py'
Jan 23 10:02:19 compute-1 sudo[139610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:02:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:19.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:02:19 compute-1 python3.9[139612]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:19 compute-1 sudo[139610]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:20 compute-1 sudo[139762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ghcuenyiugjwsprgqvettkbltporrgbh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162539.9371169-865-2108677580980/AnsiballZ_stat.py'
Jan 23 10:02:20 compute-1 sudo[139762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:20 compute-1 python3.9[139764]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:20 compute-1 sudo[139762]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:20 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:20 compute-1 sudo[139840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qilejcpbaobjesurdslkhfsmuzthumec ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162539.9371169-865-2108677580980/AnsiballZ_file.py'
Jan 23 10:02:20 compute-1 sudo[139840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:20 compute-1 python3.9[139842]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:20 compute-1 sudo[139840]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:20 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:21 compute-1 sudo[139993]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zjubsigytfrussapwivelnuytthiyxzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162541.0899506-901-94703488124002/AnsiballZ_systemd.py'
Jan 23 10:02:21 compute-1 sudo[139993]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:21.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:21 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:21 compute-1 ceph-mon[80126]: pgmap v264: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:02:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:21.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:21 compute-1 python3.9[139995]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:02:21 compute-1 systemd[1]: Reloading.
Jan 23 10:02:21 compute-1 systemd-rc-local-generator[140023]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:21 compute-1 systemd-sysv-generator[140028]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:22 compute-1 systemd[1]: Starting Create netns directory...
Jan 23 10:02:22 compute-1 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Jan 23 10:02:22 compute-1 systemd[1]: netns-placeholder.service: Deactivated successfully.
Jan 23 10:02:22 compute-1 systemd[1]: Finished Create netns directory.
Jan 23 10:02:22 compute-1 sudo[139993]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:22 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9350001aa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:22 compute-1 sudo[140187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mzqzxkxvmdkooascwrktkfoxocleqqmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162542.4676387-931-168425407566815/AnsiballZ_file.py'
Jan 23 10:02:22 compute-1 sudo[140187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:22 compute-1 python3.9[140189]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:22 compute-1 sudo[140187]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:22 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93400016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:23 compute-1 ceph-mon[80126]: pgmap v265: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:02:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100223 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:02:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:23.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:23 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:23 compute-1 sudo[140340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huzldakkqgadtpqgnovynqvwvxazydnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162543.319567-955-14467157977317/AnsiballZ_stat.py'
Jan 23 10:02:23 compute-1 sudo[140340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:23.612 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:23 compute-1 python3.9[140342]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:23 compute-1 sudo[140340]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:24 compute-1 sudo[140463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmcmnndxnfbmkxvsrcptozkeskmxbiyr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162543.319567-955-14467157977317/AnsiballZ_copy.py'
Jan 23 10:02:24 compute-1 sudo[140463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:24 compute-1 python3.9[140465]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162543.319567-955-14467157977317/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:24 compute-1 sudo[140463]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:24 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:24 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93500027b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:25 compute-1 sudo[140615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dxjdusnkykoopzxzmzoajsdwxnprhryz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162544.8695407-1006-128336380295845/AnsiballZ_file.py'
Jan 23 10:02:25 compute-1 sudo[140615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:25 compute-1 python3.9[140617]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:25 compute-1 sudo[140615]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:25.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:25 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93500027b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:25.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:25 compute-1 ceph-mon[80126]: pgmap v266: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:02:25 compute-1 sudo[140768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-teazmqjetokptsrnxqaeujjkffttbwcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162545.5909162-1030-237377552512388/AnsiballZ_file.py'
Jan 23 10:02:25 compute-1 sudo[140768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:26 compute-1 python3.9[140770]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:02:26 compute-1 sudo[140768]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:26 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:26 compute-1 sudo[140920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjulrogduqopmabetqvudswjhfmscpwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162546.3477795-1054-214909290537178/AnsiballZ_stat.py'
Jan 23 10:02:26 compute-1 sudo[140920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:26 compute-1 python3.9[140922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:02:26 compute-1 sudo[140920]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:26 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:27 compute-1 sudo[141043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wypwbybgkfpiuahiijtnwuimkqgtpfda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162546.3477795-1054-214909290537178/AnsiballZ_copy.py'
Jan 23 10:02:27 compute-1 sudo[141043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:27 compute-1 python3.9[141045]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162546.3477795-1054-214909290537178/.source.json _original_basename=.xs2hfnlj follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:27 compute-1 ceph-mon[80126]: pgmap v267: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:02:27 compute-1 sudo[141043]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:27.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:27 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:02:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:27.619 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:02:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:28 compute-1 python3.9[141196]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:28 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:28 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:29.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:29 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9340002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:29.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:29 compute-1 ceph-mon[80126]: pgmap v268: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 852 B/s wr, 2 op/s
Jan 23 10:02:30 compute-1 sudo[141618]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwmaubzxikectqzoafussiuljbmjuqmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162549.7321877-1174-66314845931107/AnsiballZ_container_config_data.py'
Jan 23 10:02:30 compute-1 sudo[141618]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:30 compute-1 python3.9[141620]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Jan 23 10:02:30 compute-1 sudo[141618]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9340002720 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:30 compute-1 ceph-mon[80126]: pgmap v269: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:30 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f9344003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:31 compute-1 sudo[141770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzmdvjlbchlkxgpfebendsllijswozdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162550.8911057-1207-257896167033408/AnsiballZ_container_config_hash.py'
Jan 23 10:02:31 compute-1 sudo[141770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:31.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:31 compute-1 python3.9[141772]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:02:31 compute-1 sudo[141770]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:31 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:31.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:32 compute-1 sudo[141923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pjjumsdpebyoyawbahpetegocjxeodwx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162551.8596308-1237-189447117566946/AnsiballZ_edpm_container_manage.py'
Jan 23 10:02:32 compute-1 sudo[141923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[132780]: 23/01/2026 10:02:32 : epoch 697346fa : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f93580034e0 fd 39 proxy ignored for local
Jan 23 10:02:32 compute-1 kernel: ganesha.nfsd[134363]: segfault at 50 ip 00007f93eb60432e sp 00007f9354ff8210 error 4 in libntirpc.so.5.8[7f93eb5e9000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 23 10:02:32 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:02:32 compute-1 systemd[1]: Started Process Core Dump (PID 141926/UID 0).
Jan 23 10:02:32 compute-1 python3[141925]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:02:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:33.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:33.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:33 compute-1 systemd-coredump[141927]: Process 132803 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 53:
                                                    #0  0x00007f93eb60432e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:02:33 compute-1 systemd[1]: systemd-coredump@5-141926-0.service: Deactivated successfully.
Jan 23 10:02:33 compute-1 systemd[1]: systemd-coredump@5-141926-0.service: Consumed 1.136s CPU time.
Jan 23 10:02:33 compute-1 podman[141979]: 2026-01-23 10:02:33.993170276 +0000 UTC m=+0.034869697 container died 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:02:34 compute-1 systemd[1]: var-lib-containers-storage-overlay-d3256e0af33385ec147484681168dd4b70efb6ad14c7bc5b7ac1287f6f69b832-merged.mount: Deactivated successfully.
Jan 23 10:02:34 compute-1 podman[141979]: 2026-01-23 10:02:34.094314093 +0000 UTC m=+0.136013494 container remove 67e5dc04ee0e690397519d9e2058fcc55763d7801d7910fdb8358f32621cd6cc (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, ceph=True)
Jan 23 10:02:34 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:02:34 compute-1 ceph-mon[80126]: pgmap v270: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:02:34 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:02:34 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.263s CPU time.
Jan 23 10:02:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 10:02:34 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 10:02:35 compute-1 ceph-mon[80126]: pgmap v271: 353 pgs: 353 active+clean; 458 KiB data, 149 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:02:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:02:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:35.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:02:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:36 compute-1 podman[142034]: 2026-01-23 10:02:36.12566918 +0000 UTC m=+0.523642641 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:02:37 compute-1 ceph-mon[80126]: pgmap v272: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 32 KiB/s rd, 0 B/s wr, 52 op/s
Jan 23 10:02:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:37.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:02:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:37.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:02:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100238 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:02:38 compute-1 sudo[142090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:02:38 compute-1 sudo[142090]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:38 compute-1 sudo[142090]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:39.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:02:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:39.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:02:40 compute-1 ceph-mon[80126]: pgmap v273: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 73 op/s
Jan 23 10:02:41 compute-1 podman[141940]: 2026-01-23 10:02:41.193248263 +0000 UTC m=+8.505194511 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:02:41 compute-1 podman[142168]: 2026-01-23 10:02:41.352787499 +0000 UTC m=+0.041784698 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:02:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:41.453 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:41.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:41 compute-1 podman[142168]: 2026-01-23 10:02:41.721097278 +0000 UTC m=+0.410094377 container create e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Jan 23 10:02:41 compute-1 python3[141925]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:02:41 compute-1 sudo[141923]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:43.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 10:02:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:43.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 10:02:44 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 6.
Jan 23 10:02:44 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:02:44 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.263s CPU time.
Jan 23 10:02:44 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:02:44 compute-1 podman[142279]: 2026-01-23 10:02:44.538409547 +0000 UTC m=+0.041488000 container create 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS)
Jan 23 10:02:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:44 compute-1 podman[142279]: 2026-01-23 10:02:44.591959897 +0000 UTC m=+0.095038390 container init 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Jan 23 10:02:44 compute-1 podman[142279]: 2026-01-23 10:02:44.596505467 +0000 UTC m=+0.099583920 container start 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
Jan 23 10:02:44 compute-1 bash[142279]: 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb
Jan 23 10:02:44 compute-1 podman[142279]: 2026-01-23 10:02:44.52141597 +0000 UTC m=+0.024494463 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:02:44 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:02:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:02:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:45.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:45.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:46 compute-1 ceph-mon[80126]: pgmap v274: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 0 B/s wr, 73 op/s
Jan 23 10:02:46 compute-1 ceph-mon[80126]: pgmap v275: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 KiB/s rd, 0 B/s wr, 141 op/s
Jan 23 10:02:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:47.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:47.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:48 compute-1 sudo[142464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-okbschzuequanlmwjalaefemixpofkpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162568.40954-1261-113238433545632/AnsiballZ_stat.py'
Jan 23 10:02:48 compute-1 sudo[142464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:48 compute-1 ceph-mon[80126]: pgmap v276: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 KiB/s rd, 0 B/s wr, 141 op/s
Jan 23 10:02:48 compute-1 ceph-mon[80126]: pgmap v277: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 90 KiB/s rd, 85 B/s wr, 150 op/s
Jan 23 10:02:48 compute-1 python3.9[142466]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:02:48 compute-1 sudo[142464]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:49.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:49 compute-1 sudo[142619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pppffstniskboxxtkfdpvwouzmopzlqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162569.2508705-1288-232674095937113/AnsiballZ_file.py'
Jan 23 10:02:49 compute-1 sudo[142619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:49.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:49 compute-1 python3.9[142621]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:49 compute-1 sudo[142619]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:49 compute-1 sudo[142695]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xqorqgvrehwnyanaxonfuojfjxludomp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162569.2508705-1288-232674095937113/AnsiballZ_stat.py'
Jan 23 10:02:49 compute-1 sudo[142695]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:50 compute-1 python3.9[142697]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:02:50 compute-1 sudo[142695]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:50 compute-1 ceph-mon[80126]: pgmap v278: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 59 KiB/s rd, 85 B/s wr, 97 op/s
Jan 23 10:02:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:02:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:02:50 compute-1 sudo[142846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-muoezbfcpbiykttianzszlkblccbqena ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162570.24057-1288-43555596506973/AnsiballZ_copy.py'
Jan 23 10:02:50 compute-1 sudo[142846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:51 compute-1 python3.9[142848]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769162570.24057-1288-43555596506973/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:02:51 compute-1 sudo[142846]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:51 compute-1 sudo[142922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gamiexcrfpylpupyhtehtzbmanrnarnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162570.24057-1288-43555596506973/AnsiballZ_systemd.py'
Jan 23 10:02:51 compute-1 sudo[142922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:51.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:51 compute-1 python3.9[142924]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:02:51 compute-1 systemd[1]: Reloading.
Jan 23 10:02:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:51.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:51 compute-1 systemd-rc-local-generator[142955]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:51 compute-1 systemd-sysv-generator[142959]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:51 compute-1 sudo[142922]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:52 compute-1 sudo[143034]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dcnixwbfunziybefnayvgabmeagrsslq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162570.24057-1288-43555596506973/AnsiballZ_systemd.py'
Jan 23 10:02:52 compute-1 sudo[143034]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:02:52 compute-1 python3.9[143036]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:02:52 compute-1 systemd[1]: Reloading.
Jan 23 10:02:52 compute-1 systemd-rc-local-generator[143066]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:52 compute-1 systemd-sysv-generator[143069]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:52 compute-1 systemd[1]: Starting ovn_metadata_agent container...
Jan 23 10:02:52 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:02:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebdebd315409c6b64b2dfe6f1c70aa65fc33d875c067e0175d97367d40cd3030/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:53 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebdebd315409c6b64b2dfe6f1c70aa65fc33d875c067e0175d97367d40cd3030/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:02:53 compute-1 systemd[1]: Started /usr/bin/podman healthcheck run e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6.
Jan 23 10:02:53 compute-1 podman[143077]: 2026-01-23 10:02:53.045373287 +0000 UTC m=+0.150750584 container init e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + sudo -E kolla_set_configs
Jan 23 10:02:53 compute-1 podman[143077]: 2026-01-23 10:02:53.075532983 +0000 UTC m=+0.180910300 container start e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 10:02:53 compute-1 edpm-start-podman-container[143077]: ovn_metadata_agent
Jan 23 10:02:53 compute-1 podman[143099]: 2026-01-23 10:02:53.13540116 +0000 UTC m=+0.050113710 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:02:53 compute-1 edpm-start-podman-container[143076]: Creating additional drop-in dependency for "ovn_metadata_agent" (e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6)
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Validating config file
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Copying service configuration files
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Writing out command to execute
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/external
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: ++ cat /run_command
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + CMD=neutron-ovn-metadata-agent
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + ARGS=
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + sudo kolla_copy_cacerts
Jan 23 10:02:53 compute-1 systemd[1]: Reloading.
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + [[ ! -n '' ]]
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + . kolla_extend_start
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: Running command: 'neutron-ovn-metadata-agent'
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + umask 0022
Jan 23 10:02:53 compute-1 ovn_metadata_agent[143093]: + exec neutron-ovn-metadata-agent
Jan 23 10:02:53 compute-1 systemd-rc-local-generator[143170]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:02:53 compute-1 systemd-sysv-generator[143176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:02:53 compute-1 systemd[1]: Started ovn_metadata_agent container.
Jan 23 10:02:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:53.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:53 compute-1 sudo[143034]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:53.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.993 143098 INFO neutron.common.config [-] Logging enabled!
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.993 143098 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.993 143098 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.994 143098 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.995 143098 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.996 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.997 143098 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.998 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:54.999 143098 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.000 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.001 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.002 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.003 143098 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.004 143098 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.005 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.006 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.007 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.008 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.009 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.010 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.011 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.012 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.013 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.014 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.015 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.016 143098 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.017 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.018 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.019 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.020 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.021 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.022 143098 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.023 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.024 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.025 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.026 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.027 143098 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.036 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.036 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.036 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.037 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.037 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.056 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 170ec811-bf2b-4b3a-9339-50a49c79a1e6 (UUID: 170ec811-bf2b-4b3a-9339-50a49c79a1e6) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.079 143098 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.080 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.080 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.080 143098 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.083 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.088 143098 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.095 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '170ec811-bf2b-4b3a-9339-50a49c79a1e6'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], external_ids={}, name=170ec811-bf2b-4b3a-9339-50a49c79a1e6, nb_cfg_timestamp=1769162502251, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.096 143098 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f31a2f53f40>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.097 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.097 143098 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.097 143098 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.098 143098 INFO oslo_service.service [-] Starting 1 workers
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.102 143098 DEBUG oslo_service.service [-] Started child 143210 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.106 143098 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmplrjtqy92/privsep.sock']
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.107 143210 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-1999580'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.129 143210 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.129 143210 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.129 143210 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.132 143210 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.140 143210 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.146 143210 INFO eventlet.wsgi.server [-] (143210) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Jan 23 10:02:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:55.476 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:55.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:55 compute-1 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Jan 23 10:02:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:02:55 compute-1 ceph-mon[80126]: pgmap v279: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 46 KiB/s rd, 85 B/s wr, 77 op/s
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.813 143098 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.814 143098 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmplrjtqy92/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.675 143216 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.680 143216 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.683 143216 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.684 143216 INFO oslo.privsep.daemon [-] privsep daemon running as pid 143216
Jan 23 10:02:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:55.817 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[97a1aa72-315d-4cf5-82dd-aa959237d05f]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.335 143216 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.335 143216 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.335 143216 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:02:56 compute-1 ceph-mon[80126]: pgmap v280: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 51 KiB/s rd, 938 B/s wr, 83 op/s
Jan 23 10:02:56 compute-1 ceph-mon[80126]: pgmap v281: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 9.4 KiB/s rd, 938 B/s wr, 15 op/s
Jan 23 10:02:56 compute-1 ceph-mon[80126]: pgmap v282: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 9.7 KiB/s rd, 1023 B/s wr, 15 op/s
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.880 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[f14283d7-5ad3-4da1-a385-14b4d45ef7b4]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.882 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, column=external_ids, values=({'neutron:ovn-metadata-id': '49259f78-9be2-5e1c-94bb-1c5d5138e24a'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.896 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.903 143098 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.903 143098 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.904 143098 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.905 143098 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.906 143098 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.907 143098 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.908 143098 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] host                           = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.909 143098 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.910 143098 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.911 143098 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.912 143098 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.913 143098 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.914 143098 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.915 143098 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.916 143098 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.917 143098 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.918 143098 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.919 143098 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.920 143098 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.921 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.922 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.923 143098 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.924 143098 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.925 143098 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.926 143098 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.927 143098 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.928 143098 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.929 143098 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.930 143098 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.931 143098 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.932 143098 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.933 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.934 143098 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.935 143098 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.936 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.937 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.938 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.939 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.940 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.941 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.941 143098 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:02:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:02:56.941 143098 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:02:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000015s ======
Jan 23 10:02:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:57.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000015s
Jan 23 10:02:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:57.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:02:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:02:58 compute-1 python3.9[143361]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Jan 23 10:02:58 compute-1 sudo[143374]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:02:58 compute-1 sudo[143374]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:02:58 compute-1 sudo[143374]: pam_unix(sudo:session): session closed for user root
Jan 23 10:02:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:59 compute-1 ceph-mon[80126]: pgmap v283: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 4.5 KiB/s rd, 938 B/s wr, 7 op/s
Jan 23 10:02:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:02:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:02:59.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:02:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:02:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:02:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:02:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000015s ======
Jan 23 10:02:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:02:59.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000015s
Jan 23 10:02:59 compute-1 sudo[143537]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ecntbnpkvahbxyvgmwrgkhoftrnhthxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162579.658673-1423-8608494354108/AnsiballZ_stat.py'
Jan 23 10:02:59 compute-1 sudo[143537]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:00 compute-1 python3.9[143539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:03:00 compute-1 sudo[143537]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:00 compute-1 sudo[143662]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-khnoljkxjjjqkurycpmzsfrqvnteoqow ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162579.658673-1423-8608494354108/AnsiballZ_copy.py'
Jan 23 10:03:00 compute-1 sudo[143662]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:00 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:00 compute-1 python3.9[143664]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162579.658673-1423-8608494354108/.source.yaml _original_basename=.i0sf62_b follow=False checksum=d88282ad6bcd11f7bd2cbc3f4703eb6122d6b05d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:00 compute-1 sudo[143662]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:01 compute-1 sshd-session[134370]: Connection closed by 192.168.122.30 port 48480
Jan 23 10:03:01 compute-1 sshd-session[134367]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:03:01 compute-1 systemd[1]: session-51.scope: Deactivated successfully.
Jan 23 10:03:01 compute-1 systemd[1]: session-51.scope: Consumed 56.527s CPU time.
Jan 23 10:03:01 compute-1 systemd-logind[807]: Session 51 logged out. Waiting for processes to exit.
Jan 23 10:03:01 compute-1 systemd-logind[807]: Removed session 51.
Jan 23 10:03:01 compute-1 ceph-mon[80126]: pgmap v284: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 4.5 KiB/s rd, 938 B/s wr, 7 op/s
Jan 23 10:03:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:01.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:01.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100302 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:03:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:02 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001680 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:02 compute-1 ceph-mon[80126]: pgmap v285: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 4.6 KiB/s rd, 938 B/s wr, 7 op/s
Jan 23 10:03:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 10:03:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:03.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 10:03:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000015s ======
Jan 23 10:03:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:03.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000015s
Jan 23 10:03:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:04 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 10:03:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:05.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 10:03:05 compute-1 ceph-mon[80126]: pgmap v286: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:03:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 10:03:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:05.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 10:03:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:07 compute-1 ceph-mon[80126]: pgmap v287: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:03:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000016s ======
Jan 23 10:03:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:07.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000016s
Jan 23 10:03:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80140016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:07.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:08 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c002d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034008dc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:09 compute-1 ceph-mon[80126]: pgmap v288: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:03:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:09.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:09.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:09 compute-1 podman[143694]: 2026-01-23 10:03:09.756355299 +0000 UTC m=+0.147780020 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 10:03:10 compute-1 sshd-session[143722]: Accepted publickey for zuul from 192.168.122.30 port 48434 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:03:10 compute-1 systemd-logind[807]: New session 52 of user zuul.
Jan 23 10:03:10 compute-1 systemd[1]: Started Session 52 of User zuul.
Jan 23 10:03:10 compute-1 sshd-session[143722]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:03:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:10 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c002d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:11 compute-1 python3.9[143875]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:03:11 compute-1 sshd-session[143876]: banner exchange: Connection from 175.126.176.50 port 50134: invalid format
Jan 23 10:03:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:11.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034008f40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:11 compute-1 ceph-mon[80126]: pgmap v289: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:03:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:11.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:12 compute-1 sudo[144031]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-odopwmoqbrhypitqkkhtpclyowiewuck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162591.8196206-59-146651263006068/AnsiballZ_command.py'
Jan 23 10:03:12 compute-1 sudo[144031]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:12 compute-1 python3.9[144033]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:12 compute-1 sudo[144031]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:13.508 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:13 compute-1 sudo[144147]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:03:13 compute-1 sudo[144147]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:13 compute-1 sudo[144147]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:13 compute-1 sudo[144239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kqdsslcvxhhxfxnnpatdwjlxktdpazia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162592.9974353-91-53748411611551/AnsiballZ_systemd_service.py'
Jan 23 10:03:13 compute-1 sudo[144239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:13 compute-1 sudo[144203]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:03:13 compute-1 sudo[144203]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:13.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:13 compute-1 python3.9[144247]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:03:13 compute-1 systemd[1]: Reloading.
Jan 23 10:03:14 compute-1 systemd-rc-local-generator[144293]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:03:14 compute-1 systemd-sysv-generator[144298]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:03:14 compute-1 sudo[144203]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:14 compute-1 sudo[144239]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:14 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:14 compute-1 ceph-mon[80126]: pgmap v290: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:03:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:15 compute-1 python3.9[144467]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:03:15 compute-1 network[144485]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:03:15 compute-1 network[144486]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:03:15 compute-1 network[144487]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:03:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:15.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:15.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:16 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:17 compute-1 ceph-mon[80126]: pgmap v291: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:03:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:03:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:03:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:03:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:03:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:17.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:17.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:18 compute-1 ceph-mon[80126]: pgmap v292: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:18 compute-1 sudo[144577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:03:18 compute-1 sudo[144577]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:18 compute-1 sudo[144577]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:19.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:19.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:19 compute-1 ceph-mon[80126]: pgmap v293: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:20 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:21 compute-1 ceph-mon[80126]: pgmap v294: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:21 compute-1 sudo[144774]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qjqedqfgjeynczivjtnpnfoicipabjip ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162600.7635188-148-49090123163121/AnsiballZ_systemd_service.py'
Jan 23 10:03:21 compute-1 sudo[144774]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:21 compute-1 python3.9[144776]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:21 compute-1 sudo[144774]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:21.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034009860 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:21.763 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:21 compute-1 sudo[144928]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufxuayhefvkfnarffgbuvyeuhjjuzozg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162601.6102688-148-226044055841927/AnsiballZ_systemd_service.py'
Jan 23 10:03:21 compute-1 sudo[144928]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:22 compute-1 python3.9[144930]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:22 compute-1 sudo[144928]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:22 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:22 compute-1 ceph-mon[80126]: pgmap v295: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:22 compute-1 sudo[145081]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rwkhgtayvsicpbcsdkbvwjsyeienoren ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162602.4422338-148-278664750069429/AnsiballZ_systemd_service.py'
Jan 23 10:03:22 compute-1 sudo[145081]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:23 compute-1 python3.9[145083]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:23 compute-1 sudo[145081]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:23.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:23 compute-1 sudo[145244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mavuicfcgtbmraokoercpeuhoceqljzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162603.2449708-148-95943323650968/AnsiballZ_systemd_service.py'
Jan 23 10:03:23 compute-1 sudo[145244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:23 compute-1 podman[145209]: 2026-01-23 10:03:23.622785847 +0000 UTC m=+0.084185946 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:03:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:23.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:23 compute-1 python3.9[145254]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:23 compute-1 sudo[145244]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:24 compute-1 sudo[145407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-anwvzjclsvcdaovplsnydzkfcctzrbwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162604.0888977-148-233026742476012/AnsiballZ_systemd_service.py'
Jan 23 10:03:24 compute-1 sudo[145407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:24 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:24 compute-1 python3.9[145409]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:24 compute-1 sudo[145407]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:25 compute-1 sudo[145560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejglripkjojrloumgphwjovsramekqjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162604.8295796-148-207683236145343/AnsiballZ_systemd_service.py'
Jan 23 10:03:25 compute-1 sudo[145560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:25 compute-1 ceph-mon[80126]: pgmap v296: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:25 compute-1 python3.9[145562]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:25 compute-1 sudo[145560]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:03:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:25.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:03:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:25.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:25 compute-1 sudo[145714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mwhqayhucxiqtiehabrkbnmfajqgdvde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162605.6339355-148-204258562411475/AnsiballZ_systemd_service.py'
Jan 23 10:03:25 compute-1 sudo[145714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:26 compute-1 python3.9[145716]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:03:26 compute-1 sudo[145714]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:26 compute-1 sudo[145742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:03:26 compute-1 sudo[145742]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:26 compute-1 sudo[145742]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:27 compute-1 ceph-mon[80126]: pgmap v297: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:03:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:27.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:27 compute-1 sudo[145893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jxmkrdueuqxqxziriqyfzsxqayijqvrk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162607.2057743-304-184414355669170/AnsiballZ_file.py'
Jan 23 10:03:27 compute-1 sudo[145893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:27.774 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:27 compute-1 python3.9[145895]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:27 compute-1 sudo[145893]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:28 compute-1 sudo[146045]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kurqnkblffdnsvexutysmcbpwlzpjspl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162608.09683-304-34571071186608/AnsiballZ_file.py'
Jan 23 10:03:28 compute-1 sudo[146045]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:28 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:28 compute-1 python3.9[146047]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:28 compute-1 sudo[146045]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:29 compute-1 sudo[146197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qsvebunisykzbnklzgouakqzlsyacpps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162608.8468487-304-59003289251414/AnsiballZ_file.py'
Jan 23 10:03:29 compute-1 sudo[146197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:29 compute-1 python3.9[146199]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:29 compute-1 sudo[146197]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:29.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:29 compute-1 ceph-mon[80126]: pgmap v298: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:29.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:29 compute-1 sudo[146352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nrqodthdirtrdvpxyrlvfitvmrgpjqql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162609.6239603-304-231628995458482/AnsiballZ_file.py'
Jan 23 10:03:29 compute-1 sudo[146352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:30 compute-1 python3.9[146354]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:30 compute-1 sudo[146352]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:30 compute-1 sudo[146504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xwagifnhnueeyuxoyolcxhvjiahoqupy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162610.3039017-304-269279353620570/AnsiballZ_file.py'
Jan 23 10:03:30 compute-1 sudo[146504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:30 compute-1 ceph-mon[80126]: pgmap v299: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:30 compute-1 python3.9[146506]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:30 compute-1 sudo[146504]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:31 compute-1 sudo[146656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jzhkfweusvwmabxktfbkzvbqvigimzli ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162611.0144572-304-177836413515607/AnsiballZ_file.py'
Jan 23 10:03:31 compute-1 sudo[146656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:31.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:31 compute-1 python3.9[146659]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:31 compute-1 sudo[146656]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:31.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:32 compute-1 sudo[146809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lhhtxdaqxldkkgezarzxuvuuzakpvkni ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162611.7344158-304-123189771139410/AnsiballZ_file.py'
Jan 23 10:03:32 compute-1 sudo[146809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:32 compute-1 python3.9[146811]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:32 compute-1 sudo[146809]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:32 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:32 compute-1 sudo[146961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-avbtwgrhuslrgcufkdibepszswfzhcxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162612.4172885-454-166468562980883/AnsiballZ_file.py'
Jan 23 10:03:32 compute-1 sudo[146961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:32 compute-1 python3.9[146963]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:32 compute-1 sudo[146961]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:33 compute-1 ceph-mon[80126]: pgmap v300: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:33 compute-1 sudo[147114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jrvtdolmgakmxqfsgtvkeohpfjvgapll ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162613.0723763-454-206038092352308/AnsiballZ_file.py'
Jan 23 10:03:33 compute-1 sudo[147114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:03:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:33.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:03:33 compute-1 python3.9[147116]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:33 compute-1 sudo[147114]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:33.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:33 compute-1 sudo[147266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iekczrwhmocvjhsmiqkjfigqkkpjnlbe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162613.7181995-454-111574640931597/AnsiballZ_file.py'
Jan 23 10:03:33 compute-1 sudo[147266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:34 compute-1 python3.9[147268]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:34 compute-1 sudo[147266]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:34 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:34 compute-1 sudo[147418]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xbagxzbozuxjlqhvdpunlsosbktcgzps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162614.3231611-454-22225996840373/AnsiballZ_file.py'
Jan 23 10:03:34 compute-1 sudo[147418]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:34 compute-1 python3.9[147420]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:34 compute-1 sudo[147418]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:35 compute-1 sudo[147570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nzcrudghdesrwosllmdptwiacridukqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162615.0132806-454-214418433953943/AnsiballZ_file.py'
Jan 23 10:03:35 compute-1 sudo[147570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:35 compute-1 ceph-mon[80126]: pgmap v301: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:35 compute-1 python3.9[147572]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:35 compute-1 sudo[147570]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:35.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:35.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:35 compute-1 sudo[147723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmqzxfpqrttoogghjmjufykocldkikms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162615.6267235-454-58974073431433/AnsiballZ_file.py'
Jan 23 10:03:35 compute-1 sudo[147723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:36 compute-1 python3.9[147725]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:36 compute-1 sudo[147723]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:36 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:36 compute-1 sudo[147875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yzgdhapdhitmhkpahovwoxrobiwclabr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162616.253236-454-213136998507321/AnsiballZ_file.py'
Jan 23 10:03:36 compute-1 sudo[147875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:36 compute-1 python3.9[147877]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:03:36 compute-1 sudo[147875]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:37 compute-1 ceph-mon[80126]: pgmap v302: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:37 compute-1 sudo[148028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lghgyynewqiazvcnrupmjkkoemvnfhxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162617.2026367-607-225349265770139/AnsiballZ_command.py'
Jan 23 10:03:37 compute-1 sudo[148028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:37.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:37 compute-1 python3.9[148030]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:37 compute-1 sudo[148028]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:37.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:38 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:38 compute-1 ceph-mon[80126]: pgmap v303: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:38 compute-1 python3.9[148182]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:03:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80180016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:39 compute-1 sudo[148207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:03:39 compute-1 sudo[148207]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:39 compute-1 sudo[148207]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:39 compute-1 sudo[148358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wvctyutpxvpxeycdvehdkdxxgadtnxxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162619.1872885-661-33040829516455/AnsiballZ_systemd_service.py'
Jan 23 10:03:39 compute-1 sudo[148358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:39.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:39 compute-1 python3.9[148360]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:03:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:39.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:39 compute-1 systemd[1]: Reloading.
Jan 23 10:03:39 compute-1 systemd-rc-local-generator[148404]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:03:39 compute-1 systemd-sysv-generator[148408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:03:39 compute-1 podman[148362]: 2026-01-23 10:03:39.932606284 +0000 UTC m=+0.111219943 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, 
maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 10:03:40 compute-1 sudo[148358]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:40 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:40 compute-1 sudo[148572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sxbooxyhthldyuynusftdhwsxxmemrxm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162620.3529506-685-102581740803703/AnsiballZ_command.py'
Jan 23 10:03:40 compute-1 sudo[148572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:40 compute-1 python3.9[148574]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:40 compute-1 sudo[148572]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:41 compute-1 sudo[148725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dpizdncwcrgliciwwrvvzcefwcifktor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162621.05084-685-105473080773022/AnsiballZ_command.py'
Jan 23 10:03:41 compute-1 sudo[148725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:41.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:41 compute-1 python3.9[148728]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:41 compute-1 ceph-mon[80126]: pgmap v304: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:41.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:42 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:42 compute-1 sudo[148725]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:42 compute-1 ceph-mon[80126]: pgmap v305: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:43 compute-1 sudo[148879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nooqxjzyqgqwtutkjmijtozmwhtkcnhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162622.784344-685-16327980489529/AnsiballZ_command.py'
Jan 23 10:03:43 compute-1 sudo[148879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:43 compute-1 python3.9[148881]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:43 compute-1 sudo[148879]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:43.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:43.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:43 compute-1 sudo[149033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dtjtndrkouoeqehztnjodkxdyotbasft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162623.516125-685-245802191465699/AnsiballZ_command.py'
Jan 23 10:03:43 compute-1 sudo[149033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:43 compute-1 python3.9[149035]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:44 compute-1 sudo[149033]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:44 compute-1 sudo[149186]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uaoaqruplbbjecpitungtzhirpnaxjcv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162624.1622841-685-241308205137256/AnsiballZ_command.py'
Jan 23 10:03:44 compute-1 sudo[149186]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:44 compute-1 python3.9[149188]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:44 compute-1 sudo[149186]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:45 compute-1 sudo[149339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmlebarywcsfiuubetfmtfyypehslhib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162624.8695526-685-83786487997976/AnsiballZ_command.py'
Jan 23 10:03:45 compute-1 sudo[149339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:45 compute-1 ceph-mon[80126]: pgmap v306: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:45 compute-1 python3.9[149341]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:45 compute-1 sudo[149339]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:45.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:45.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:46 compute-1 sudo[149493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kdcddkrqqyamowuzhjxhxwbtszmkbbqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162625.5717795-685-169633769738422/AnsiballZ_command.py'
Jan 23 10:03:46 compute-1 sudo[149493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:46 compute-1 python3.9[149495]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:03:46 compute-1 sudo[149493]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:46 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:47 compute-1 sudo[149646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-godteiqxhneirgvfspnbmmohgfalaqze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162626.8766367-847-228772451783785/AnsiballZ_getent.py'
Jan 23 10:03:47 compute-1 sudo[149646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:47 compute-1 python3.9[149648]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Jan 23 10:03:47 compute-1 ceph-mon[80126]: pgmap v307: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:03:47 compute-1 sudo[149646]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:47.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:47.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:48 compute-1 sudo[149800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nwvniughtpeciqeodlqpynpyzkhpiroy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162627.747097-871-130618420371443/AnsiballZ_group.py'
Jan 23 10:03:48 compute-1 sudo[149800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:48 compute-1 python3.9[149802]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 10:03:48 compute-1 groupadd[149803]: group added to /etc/group: name=libvirt, GID=42473
Jan 23 10:03:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:48 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:48 compute-1 groupadd[149803]: group added to /etc/gshadow: name=libvirt
Jan 23 10:03:48 compute-1 groupadd[149803]: new group: name=libvirt, GID=42473
Jan 23 10:03:48 compute-1 sudo[149800]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:49 compute-1 sudo[149958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jytqokwclgkrzikynsljpctwnvhcdqla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162628.8144712-895-203328968610932/AnsiballZ_user.py'
Jan 23 10:03:49 compute-1 sudo[149958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:49 compute-1 python3.9[149960]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 10:03:49 compute-1 ceph-mon[80126]: pgmap v308: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:49 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:03:49 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:03:49 compute-1 useradd[149963]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Jan 23 10:03:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:49.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:49 compute-1 sudo[149958]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:03:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:49.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:03:50 compute-1 sudo[150120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uvcbjfjtdklgwenblehtsudrcfrgsbqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162630.1322372-928-2152591582463/AnsiballZ_setup.py'
Jan 23 10:03:50 compute-1 sudo[150120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:03:50 compute-1 python3.9[150122]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:03:51 compute-1 sudo[150120]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:51 compute-1 sudo[150205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qyjslkuvowxtnspcvxfdkylwbalcjjim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162630.1322372-928-2152591582463/AnsiballZ_dnf.py'
Jan 23 10:03:51 compute-1 sudo[150205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:03:51 compute-1 ceph-mon[80126]: pgmap v309: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:03:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:51.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:03:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:51 compute-1 python3.9[150207]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:03:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:51.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:52 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:52 compute-1 ceph-mon[80126]: pgmap v310: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:03:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:53.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:53.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100354 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:03:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:54 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:54 compute-1 podman[150216]: 2026-01-23 10:03:54.685781587 +0000 UTC m=+0.079505828 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 23 10:03:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:03:55.029 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:03:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:03:55.030 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:03:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:03:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:03:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:55 compute-1 ceph-mon[80126]: pgmap v311: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:03:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:55.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:03:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:55.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:56 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:57 compute-1 ceph-mon[80126]: pgmap v312: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:03:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:57.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:03:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:57.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:03:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a180 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:59 compute-1 sudo[150253]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:03:59 compute-1 sudo[150253]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:03:59 compute-1 sudo[150253]: pam_unix(sudo:session): session closed for user root
Jan 23 10:03:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100359 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 1ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:03:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:03:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:03:59.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:03:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:03:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:03:59 compute-1 ceph-mon[80126]: pgmap v313: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:03:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:03:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:03:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:03:59.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:00 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:00 compute-1 ceph-mon[80126]: pgmap v314: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:04:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:01.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:01.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:02 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c003da0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:04:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:03 compute-1 ceph-mon[80126]: pgmap v315: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:04:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:03.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:03.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:04 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:05 compute-1 ceph-mon[80126]: pgmap v316: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:04:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:05.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:05.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:04:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:04:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c001230 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:07 compute-1 ceph-mon[80126]: pgmap v317: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 10:04:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:07.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:07.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:08 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:04:09 compute-1 ceph-mon[80126]: pgmap v318: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 10:04:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:09.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002800 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:10 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:10 compute-1 podman[150449]: 2026-01-23 10:04:10.773885111 +0000 UTC m=+0.164532380 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible)
Jan 23 10:04:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:11 compute-1 ceph-mon[80126]: pgmap v319: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.5 KiB/s rd, 682 B/s wr, 2 op/s
Jan 23 10:04:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:11.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:11.843 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:04:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:13 compute-1 ceph-mon[80126]: pgmap v320: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:04:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:13.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:13.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:14 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002800 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:04:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:04:15 compute-1 ceph-mon[80126]: pgmap v321: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.6 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:04:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:15.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:15.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100416 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:04:16 compute-1 sshd-session[150478]: Invalid user sol from 45.148.10.240 port 58756
Jan 23 10:04:16 compute-1 sshd-session[150478]: Connection closed by invalid user sol 45.148.10.240 port 58756 [preauth]
Jan 23 10:04:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:16 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:16 compute-1 ceph-mon[80126]: pgmap v322: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.6 KiB/s rd, 1.7 KiB/s wr, 5 op/s
Jan 23 10:04:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:17.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:17.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:04:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:18 compute-1 kernel: SELinux:  Converting 2779 SID table entries...
Jan 23 10:04:18 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 10:04:18 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 23 10:04:18 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 10:04:18 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 23 10:04:18 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 10:04:18 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 10:04:18 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 10:04:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:19 compute-1 ceph-mon[80126]: pgmap v323: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Jan 23 10:04:19 compute-1 sudo[150488]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:04:19 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Jan 23 10:04:19 compute-1 sudo[150488]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:19 compute-1 sudo[150488]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:04:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:19.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:04:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:04:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:19.856 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:04:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:20 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:21 compute-1 ceph-mon[80126]: pgmap v324: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Jan 23 10:04:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100421 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:04:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:21.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:21.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:22 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:23 compute-1 ceph-mon[80126]: pgmap v325: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.6 KiB/s rd, 1.1 KiB/s wr, 3 op/s
Jan 23 10:04:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:23.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:23.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:24 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:25 compute-1 ceph-mon[80126]: pgmap v326: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 10:04:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:04:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:25.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:04:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:25 compute-1 podman[150518]: 2026-01-23 10:04:25.742343689 +0000 UTC m=+0.100987185 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:04:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:25.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:26 compute-1 sudo[150537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:04:26 compute-1 sudo[150537]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:26 compute-1 sudo[150537]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:26 compute-1 sudo[150562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:04:26 compute-1 sudo[150562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:27 compute-1 ceph-mon[80126]: pgmap v327: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 10:04:27 compute-1 sudo[150562]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:27.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:27.868 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:04:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:04:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:28 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c50 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:29 compute-1 kernel: SELinux:  Converting 2779 SID table entries...
Jan 23 10:04:29 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 10:04:29 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 23 10:04:29 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 10:04:29 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 23 10:04:29 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 10:04:29 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 10:04:29 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 10:04:29 compute-1 ceph-mon[80126]: pgmap v328: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:04:29 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:29 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:04:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:29.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:04:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:29.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:04:30 compute-1 ceph-mon[80126]: pgmap v329: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:04:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:04:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:31.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:31.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:32 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:33 compute-1 ceph-mon[80126]: pgmap v330: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:04:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000097s ======
Jan 23 10:04:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:33.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000097s
Jan 23 10:04:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:33.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:34 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:35 compute-1 ceph-mon[80126]: pgmap v331: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:04:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:35.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:04:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:35 compute-1 sudo[150631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:04:35 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Jan 23 10:04:35 compute-1 sudo[150631]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:35.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:35 compute-1 sudo[150631]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:36 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:04:36 compute-1 ceph-mon[80126]: pgmap v332: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:37.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:37.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:38 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:39 compute-1 sudo[150657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:04:39 compute-1 sudo[150657]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:39 compute-1 sudo[150657]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:39.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 10:04:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:39.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 10:04:40 compute-1 ceph-mon[80126]: pgmap v333: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:40 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80340093d0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:41.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:41 compute-1 podman[150684]: 2026-01-23 10:04:41.712785226 +0000 UTC m=+0.104185878 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Jan 23 10:04:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:41.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:42 compute-1 ceph-mon[80126]: pgmap v334: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:42 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c70 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:43.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:43 compute-1 ceph-mon[80126]: pgmap v335: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:04:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:43.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:44 compute-1 ceph-mon[80126]: pgmap v336: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003c90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:04:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:45.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:04:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:45.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:46 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:47 compute-1 ceph-mon[80126]: pgmap v337: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:47.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:47.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:48 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:49 compute-1 ceph-mon[80126]: pgmap v338: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001380 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:49.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:49.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:04:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cd0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:51 compute-1 ceph-mon[80126]: pgmap v339: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:51.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:51.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:52 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:53 compute-1 ceph-mon[80126]: pgmap v340: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:04:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:53.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:53.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:54 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:04:55.030 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:04:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:04:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:04:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:04:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:04:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003cf0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:55 compute-1 ceph-mon[80126]: pgmap v341: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:55.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:55.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:04:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:56 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8018003ed0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:56 compute-1 podman[158885]: 2026-01-23 10:04:56.668430653 +0000 UTC m=+0.070349665 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:04:56 compute-1 ceph-mon[80126]: pgmap v342: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:57.699 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:57 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d10 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:04:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:57.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:58 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:59 compute-1 ceph-mon[80126]: pgmap v343: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:04:59 compute-1 sudo[160493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:04:59 compute-1 sudo[160493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:04:59 compute-1 sudo[160493]: pam_unix(sudo:session): session closed for user root
Jan 23 10:04:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:04:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:04:59.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:04:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:04:59 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:04:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:04:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:04:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:04:59.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:00 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:01 compute-1 anacron[2198]: Job `cron.weekly' started
Jan 23 10:05:01 compute-1 anacron[2198]: Job `cron.weekly' terminated
Jan 23 10:05:01 compute-1 ceph-mon[80126]: pgmap v344: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:01.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:01 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:01.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:02 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8034001520 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:03 compute-1 ceph-mon[80126]: pgmap v345: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:05:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:03.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:03 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:03.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:04 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f80380013a0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:05.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:05 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:05.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:06 compute-1 ceph-mon[80126]: pgmap v346: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:06 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f800c000d00 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:07 compute-1 ceph-mon[80126]: pgmap v347: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:07.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:07 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:07.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:08 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:08 compute-1 ceph-mon[80126]: pgmap v348: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:09 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:09.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:09.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:10 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:10 compute-1 ceph-mon[80126]: pgmap v349: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f800c001820 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:11 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:11.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:11.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:12 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038001eb0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:12 compute-1 podman[167485]: 2026-01-23 10:05:12.707724302 +0000 UTC m=+0.105029295 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:05:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:13 compute-1 ceph-mon[80126]: pgmap v350: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:05:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:13 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:13.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:13.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:14 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:14 compute-1 ceph-mon[80126]: pgmap v351: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:15 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000097s ======
Jan 23 10:05:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:15.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000097s
Jan 23 10:05:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:15.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:16 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:17 compute-1 ceph-mon[80126]: pgmap v352: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:17 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:17.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:17.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100518 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:05:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:18 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:19 compute-1 ceph-mon[80126]: pgmap v353: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:19 compute-1 sudo[167674]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:05:19 compute-1 sudo[167674]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:19 compute-1 sudo[167674]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:19 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:19.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:19.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:20 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038003340 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:21 compute-1 ceph-mon[80126]: pgmap v354: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:21 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:21.743 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:21.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:22 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038004440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:23 compute-1 ceph-mon[80126]: pgmap v355: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:05:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:23 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:23.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:23.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:24 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:25 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038004440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:25.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:25 compute-1 ceph-mon[80126]: pgmap v356: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:05:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:25.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:26 compute-1 kernel: SELinux:  Converting 2780 SID table entries...
Jan 23 10:05:26 compute-1 kernel: SELinux:  policy capability network_peer_controls=1
Jan 23 10:05:26 compute-1 kernel: SELinux:  policy capability open_perms=1
Jan 23 10:05:26 compute-1 kernel: SELinux:  policy capability extended_socket_class=1
Jan 23 10:05:26 compute-1 kernel: SELinux:  policy capability always_check_network=0
Jan 23 10:05:26 compute-1 kernel: SELinux:  policy capability cgroup_seclabel=1
Jan 23 10:05:26 compute-1 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Jan 23 10:05:26 compute-1 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Jan 23 10:05:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:26 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Jan 23 10:05:26 compute-1 ceph-mon[80126]: pgmap v357: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:26 compute-1 podman[167710]: 2026-01-23 10:05:26.947191901 +0000 UTC m=+0.062311076 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 10:05:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:26 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:05:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:27 compute-1 groupadd[167733]: group added to /etc/group: name=dnsmasq, GID=993
Jan 23 10:05:27 compute-1 groupadd[167733]: group added to /etc/gshadow: name=dnsmasq
Jan 23 10:05:27 compute-1 groupadd[167733]: new group: name=dnsmasq, GID=993
Jan 23 10:05:27 compute-1 useradd[167740]: new user: name=dnsmasq, UID=992, GID=993, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Jan 23 10:05:27 compute-1 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 10:05:27 compute-1 dbus-broker-launch[753]: Noticed file-system modification, trigger reload.
Jan 23 10:05:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:27 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:27.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:27.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:28 compute-1 groupadd[167754]: group added to /etc/group: name=clevis, GID=992
Jan 23 10:05:28 compute-1 groupadd[167754]: group added to /etc/gshadow: name=clevis
Jan 23 10:05:28 compute-1 groupadd[167754]: new group: name=clevis, GID=992
Jan 23 10:05:28 compute-1 useradd[167761]: new user: name=clevis, UID=991, GID=992, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Jan 23 10:05:28 compute-1 usermod[167771]: add 'clevis' to group 'tss'
Jan 23 10:05:28 compute-1 usermod[167771]: add 'clevis' to shadow group 'tss'
Jan 23 10:05:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:28 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8038004440 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:29 compute-1 ceph-mon[80126]: pgmap v358: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:29 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:29.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:29.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:05:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:05:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:30 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:30 compute-1 ceph-mon[80126]: pgmap v359: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 596 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:31 compute-1 polkitd[43458]: Reloading rules
Jan 23 10:05:31 compute-1 polkitd[43458]: Collecting garbage unconditionally...
Jan 23 10:05:31 compute-1 polkitd[43458]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 10:05:31 compute-1 polkitd[43458]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 10:05:31 compute-1 polkitd[43458]: Finished loading, compiling and executing 3 rules
Jan 23 10:05:31 compute-1 polkitd[43458]: Reloading rules
Jan 23 10:05:31 compute-1 polkitd[43458]: Collecting garbage unconditionally...
Jan 23 10:05:31 compute-1 polkitd[43458]: Loading rules from directory /etc/polkit-1/rules.d
Jan 23 10:05:31 compute-1 polkitd[43458]: Loading rules from directory /usr/share/polkit-1/rules.d
Jan 23 10:05:31 compute-1 polkitd[43458]: Finished loading, compiling and executing 3 rules
Jan 23 10:05:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:31 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:31.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:31.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:32 compute-1 groupadd[167963]: group added to /etc/group: name=ceph, GID=167
Jan 23 10:05:32 compute-1 groupadd[167963]: group added to /etc/gshadow: name=ceph
Jan 23 10:05:32 compute-1 groupadd[167963]: new group: name=ceph, GID=167
Jan 23 10:05:32 compute-1 useradd[167969]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Jan 23 10:05:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:32 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c002d30 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:05:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f800c0028c0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:33 compute-1 ceph-mon[80126]: pgmap v360: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:05:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:33 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:33.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:33.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:34 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:35 compute-1 sshd[1007]: Received signal 15; terminating.
Jan 23 10:05:35 compute-1 systemd[1]: Stopping OpenSSH server daemon...
Jan 23 10:05:35 compute-1 systemd[1]: sshd.service: Deactivated successfully.
Jan 23 10:05:35 compute-1 systemd[1]: Stopped OpenSSH server daemon.
Jan 23 10:05:35 compute-1 systemd[1]: sshd.service: Consumed 3.108s CPU time, read 32.0K from disk, written 80.0K to disk.
Jan 23 10:05:35 compute-1 systemd[1]: Stopped target sshd-keygen.target.
Jan 23 10:05:35 compute-1 systemd[1]: Stopping sshd-keygen.target...
Jan 23 10:05:35 compute-1 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 10:05:35 compute-1 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 10:05:35 compute-1 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Jan 23 10:05:35 compute-1 systemd[1]: Reached target sshd-keygen.target.
Jan 23 10:05:35 compute-1 systemd[1]: Starting OpenSSH server daemon...
Jan 23 10:05:35 compute-1 sshd[168617]: Server listening on 0.0.0.0 port 22.
Jan 23 10:05:35 compute-1 sshd[168617]: Server listening on :: port 22.
Jan 23 10:05:35 compute-1 systemd[1]: Started OpenSSH server daemon.
Jan 23 10:05:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:35 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:35.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:35.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:36 compute-1 sudo[168666]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:05:36 compute-1 sudo[168666]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:36 compute-1 sudo[168666]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:36 compute-1 sudo[168700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:05:36 compute-1 sudo[168700]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:36 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:36 compute-1 ceph-mon[80126]: pgmap v361: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:36 compute-1 podman[168873]: 2026-01-23 10:05:36.729640513 +0000 UTC m=+0.067186342 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 10:05:36 compute-1 podman[168873]: 2026-01-23 10:05:36.832918731 +0000 UTC m=+0.170464560 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Jan 23 10:05:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:37 compute-1 podman[169050]: 2026-01-23 10:05:37.292685156 +0000 UTC m=+0.064757655 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:05:37 compute-1 podman[169050]: 2026-01-23 10:05:37.308832413 +0000 UTC m=+0.080904862 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:05:37 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 10:05:37 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 10:05:37 compute-1 ceph-mon[80126]: pgmap v362: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:05:37 compute-1 systemd[1]: Reloading.
Jan 23 10:05:37 compute-1 podman[169179]: 2026-01-23 10:05:37.728562796 +0000 UTC m=+0.077536674 container exec 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:05:37 compute-1 systemd-rc-local-generator[169220]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:37 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:37 compute-1 systemd-sysv-generator[169226]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:37.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:37 compute-1 podman[169179]: 2026-01-23 10:05:37.778899088 +0000 UTC m=+0.127872906 container exec_died 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20250325, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:05:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:37 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 10:05:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:38 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:39 compute-1 sudo[171272]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:05:39 compute-1 sudo[171272]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:39 compute-1 sudo[171272]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:39 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:39.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:40 compute-1 ceph-mon[80126]: pgmap v363: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:40 compute-1 podman[169524]: 2026-01-23 10:05:40.04484279 +0000 UTC m=+1.955055936 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:05:40 compute-1 podman[169524]: 2026-01-23 10:05:40.053860979 +0000 UTC m=+1.964074155 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:05:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100540 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:05:40 compute-1 podman[172103]: 2026-01-23 10:05:40.307545373 +0000 UTC m=+0.056639645 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, com.redhat.component=keepalived-container, release=1793, summary=Provides keepalived on RHEL 9 for Ceph., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., name=keepalived, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=2.2.4, build-date=2023-02-22T09:23:20, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.tags=Ceph keepalived, io.openshift.expose-services=)
Jan 23 10:05:40 compute-1 podman[172103]: 2026-01-23 10:05:40.323827625 +0000 UTC m=+0.072921847 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, vcs-type=git, description=keepalived for Ceph, io.buildah.version=1.28.2, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, name=keepalived, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2023-02-22T09:23:20, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=keepalived-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=2.2.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793)
Jan 23 10:05:40 compute-1 sudo[168700]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:40 compute-1 sudo[172353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:05:40 compute-1 sudo[172353]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:40 compute-1 sudo[172353]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:40 compute-1 sudo[172459]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:05:40 compute-1 sudo[172459]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:40 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:40 compute-1 sudo[150205]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:41 compute-1 sudo[172459]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:41 compute-1 ceph-mon[80126]: pgmap v364: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:05:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:05:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:41 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:41.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:41.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:42 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:43 compute-1 ceph-mon[80126]: pgmap v365: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:05:43 compute-1 podman[175760]: 2026-01-23 10:05:43.678224256 +0000 UTC m=+0.085773777 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 10:05:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:43 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:43.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:43.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:44 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014001090 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:45 compute-1 ceph-mon[80126]: pgmap v366: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:45 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:45.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:45 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 10:05:45 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 10:05:45 compute-1 systemd[1]: man-db-cache-update.service: Consumed 10.558s CPU time.
Jan 23 10:05:45 compute-1 systemd[1]: run-r69442ff605504ec99d0d39e9e17d0e5a.service: Deactivated successfully.
Jan 23 10:05:45 compute-1 sudo[177894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:05:45 compute-1 sudo[177894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:45 compute-1 sudo[177894]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:45.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:46 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:46 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:46 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:05:46 compute-1 ceph-mon[80126]: pgmap v367: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:05:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:47 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:47.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:48.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:48 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:49 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:49.789 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:50.006 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:50 compute-1 ceph-mon[80126]: pgmap v368: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:05:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:05:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:50 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:51 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:51 compute-1 ceph-mon[80126]: pgmap v369: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:05:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:51.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:52.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:52 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:52 compute-1 sudo[178048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ozhkyaodvhlesvxyummiyvvptqydbuat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162752.0605614-964-10359016572245/AnsiballZ_systemd.py'
Jan 23 10:05:52 compute-1 sudo[178048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:52 compute-1 python3.9[178050]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:53 compute-1 systemd[1]: Reloading.
Jan 23 10:05:53 compute-1 systemd-rc-local-generator[178080]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:53 compute-1 systemd-sysv-generator[178083]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8014003390 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:53 compute-1 ceph-mon[80126]: pgmap v370: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:05:53 compute-1 sudo[178048]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:53 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:53.796 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:53 compute-1 sudo[178239]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aisawsdjxwjovevzdnggunhheezphjqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162753.5494807-964-88771969397536/AnsiballZ_systemd.py'
Jan 23 10:05:53 compute-1 sudo[178239]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:54.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:54 compute-1 python3.9[178241]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:54 compute-1 systemd[1]: Reloading.
Jan 23 10:05:54 compute-1 systemd-sysv-generator[178273]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:54 compute-1 systemd-rc-local-generator[178269]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:54 compute-1 sudo[178239]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:54 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f803400a690 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:05:55.030 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:05:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:05:55.031 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:05:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:05:55.032 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:05:55 compute-1 sudo[178429]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xpgfdbpppxjhqdrgzvosgzyvvcntlrdd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162754.7489603-964-191291373430141/AnsiballZ_systemd.py'
Jan 23 10:05:55 compute-1 sudo[178429]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:55 compute-1 ceph-mon[80126]: pgmap v371: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:55 compute-1 python3.9[178431]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:55 compute-1 systemd[1]: Reloading.
Jan 23 10:05:55 compute-1 systemd-rc-local-generator[178463]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:55 compute-1 systemd-sysv-generator[178467]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:55 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f801c003e90 fd 48 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:05:55 compute-1 sudo[178429]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:55.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:05:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:56.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:05:56 compute-1 sudo[178621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mxfikqugthyvibqbzpoidffdmfqdhdhn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162755.8806117-964-167025175832193/AnsiballZ_systemd.py'
Jan 23 10:05:56 compute-1 sudo[178621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:56 compute-1 python3.9[178623]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:05:56 compute-1 systemd[1]: Reloading.
Jan 23 10:05:56 compute-1 systemd-sysv-generator[178656]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:56 compute-1 systemd-rc-local-generator[178651]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[142294]: 23/01/2026 10:05:56 : epoch 69734744 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f802c004cc0 fd 48 proxy ignored for local
Jan 23 10:05:56 compute-1 kernel: ganesha.nfsd[143339]: segfault at 50 ip 00007f80bffe732e sp 00007f80427fb210 error 4 in libntirpc.so.5.8[7f80bffcc000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 10:05:56 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:05:56 compute-1 systemd[1]: Started Process Core Dump (PID 178660/UID 0).
Jan 23 10:05:56 compute-1 sudo[178621]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:57 compute-1 ceph-mon[80126]: pgmap v372: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:05:57 compute-1 sudo[178829]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yddvkuildldwhnueulnbdbbvqkgnooqg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162757.1049771-1051-219503202328557/AnsiballZ_systemd.py'
Jan 23 10:05:57 compute-1 sudo[178829]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:57 compute-1 podman[178786]: 2026-01-23 10:05:57.427129599 +0000 UTC m=+0.067893584 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:05:57 compute-1 python3.9[178834]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:05:57 compute-1 systemd[1]: Reloading.
Jan 23 10:05:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:05:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:57.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:57 compute-1 systemd-rc-local-generator[178863]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:57 compute-1 systemd-sysv-generator[178867]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:57 compute-1 systemd-coredump[178662]: Process 142298 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 43:
                                                    #0  0x00007f80bffe732e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:05:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:05:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:05:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:05:58 compute-1 podman[178875]: 2026-01-23 10:05:58.075266481 +0000 UTC m=+0.026826830 container died 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default)
Jan 23 10:05:58 compute-1 podman[178875]: 2026-01-23 10:05:58.114890478 +0000 UTC m=+0.066450807 container remove 8129fb52cb4700c7a5e5ad6cc05cfa9b61e53ebacb6f78fbe5debd8e7ffa2afb (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1)
Jan 23 10:05:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-a861562c25d86152e9009e5dab6d40213d6e77dcc5a10ba1fd5e7b9314417005-merged.mount: Deactivated successfully.
Jan 23 10:05:58 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:05:58 compute-1 systemd[1]: systemd-coredump@6-178660-0.service: Deactivated successfully.
Jan 23 10:05:58 compute-1 systemd[1]: systemd-coredump@6-178660-0.service: Consumed 1.094s CPU time.
Jan 23 10:05:58 compute-1 sudo[178829]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:58 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:05:58 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.656s CPU time.
Jan 23 10:05:58 compute-1 sudo[179069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjtwlsyycbqtwsxqfrnoaiwcshuhebbb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162758.2937784-1051-29339939715070/AnsiballZ_systemd.py'
Jan 23 10:05:58 compute-1 sudo[179069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:05:58 compute-1 python3.9[179071]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:05:59 compute-1 systemd[1]: Reloading.
Jan 23 10:05:59 compute-1 systemd-rc-local-generator[179100]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:05:59 compute-1 systemd-sysv-generator[179105]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:05:59 compute-1 ceph-mon[80126]: pgmap v373: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:05:59 compute-1 sudo[179069]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:59 compute-1 sudo[179211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:05:59 compute-1 sudo[179211]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:05:59 compute-1 sudo[179211]: pam_unix(sudo:session): session closed for user root
Jan 23 10:05:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:05:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:05:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:05:59.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:05:59 compute-1 sudo[179286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cnjavhhjgzgtkvrvcbuwjnsyrybrsclq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162759.568476-1051-120158656206331/AnsiballZ_systemd.py'
Jan 23 10:05:59 compute-1 sudo[179286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:00 compute-1 python3.9[179288]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:00 compute-1 systemd[1]: Reloading.
Jan 23 10:06:00 compute-1 systemd-rc-local-generator[179318]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:06:00 compute-1 systemd-sysv-generator[179321]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:06:00 compute-1 sudo[179286]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:01 compute-1 sudo[179476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-twiyvpjxmaolxigtbgofdtpznewcbnqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162760.7136266-1051-272051328051532/AnsiballZ_systemd.py'
Jan 23 10:06:01 compute-1 sudo[179476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:01 compute-1 python3.9[179478]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:01 compute-1 ceph-mon[80126]: pgmap v374: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:01 compute-1 sudo[179476]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:01.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:01 compute-1 sudo[179632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zxvodqvvuwowjbbpeykkujgqxsibwreb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162761.5603876-1051-238337588091655/AnsiballZ_systemd.py'
Jan 23 10:06:01 compute-1 sudo[179632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:02.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:02 compute-1 python3.9[179634]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:02 compute-1 systemd[1]: Reloading.
Jan 23 10:06:02 compute-1 systemd-rc-local-generator[179663]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:06:02 compute-1 systemd-sysv-generator[179666]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:06:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100602 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:06:02 compute-1 sudo[179632]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:03 compute-1 sudo[179822]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mnjmqucsqjyfcspivmtznlwosafmoxnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162762.8078105-1159-76535133324609/AnsiballZ_systemd.py'
Jan 23 10:06:03 compute-1 sudo[179822]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:03 compute-1 python3.9[179824]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Jan 23 10:06:03 compute-1 ceph-mon[80126]: pgmap v375: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:06:03 compute-1 systemd[1]: Reloading.
Jan 23 10:06:03 compute-1 systemd-sysv-generator[179859]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:06:03 compute-1 systemd-rc-local-generator[179856]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:06:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:03.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:03 compute-1 systemd[1]: Listening on libvirt proxy daemon socket.
Jan 23 10:06:03 compute-1 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Jan 23 10:06:03 compute-1 sudo[179822]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:04.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:04 compute-1 sudo[180016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-syxkeevmrthjwrizsvkdniefpdjnbdoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162764.0641394-1183-258356698564126/AnsiballZ_systemd.py'
Jan 23 10:06:04 compute-1 sudo[180016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:04 compute-1 python3.9[180018]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:04 compute-1 sudo[180016]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:05 compute-1 sudo[180171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zormfjicfdunqbzadmoggebrmqksqnqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162764.9404707-1183-135642817370176/AnsiballZ_systemd.py'
Jan 23 10:06:05 compute-1 sudo[180171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:05 compute-1 python3.9[180173]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:05 compute-1 ceph-mon[80126]: pgmap v376: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:05 compute-1 sudo[180171]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:05.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:06.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:06 compute-1 sudo[180327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oifpfthdvtblhejlduwaxpbqbacmhrev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162765.7179475-1183-123010174003231/AnsiballZ_systemd.py'
Jan 23 10:06:06 compute-1 sudo[180327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:06 compute-1 python3.9[180329]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:07 compute-1 sudo[180327]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:07 compute-1 ceph-mon[80126]: pgmap v377: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:06:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:07.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:07 compute-1 sudo[180483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcgarrvcwbgkeapjgjdjdbndozupuukv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162767.6099355-1183-4647158864242/AnsiballZ_systemd.py'
Jan 23 10:06:07 compute-1 sudo[180483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:08.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:08 compute-1 python3.9[180485]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:08 compute-1 sudo[180483]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:08 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 7.
Jan 23 10:06:08 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:06:08 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.656s CPU time.
Jan 23 10:06:08 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:06:08 compute-1 podman[180607]: 2026-01-23 10:06:08.593704612 +0000 UTC m=+0.081602292 container create 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:06:08 compute-1 podman[180607]: 2026-01-23 10:06:08.539388644 +0000 UTC m=+0.027286344 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:06:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:08 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:06:08 compute-1 podman[180607]: 2026-01-23 10:06:08.667840385 +0000 UTC m=+0.155738095 container init 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:06:08 compute-1 podman[180607]: 2026-01-23 10:06:08.674542779 +0000 UTC m=+0.162440449 container start 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:06:08 compute-1 bash[180607]: 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da
Jan 23 10:06:08 compute-1 ceph-mon[80126]: pgmap v378: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:06:08 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:06:08 compute-1 sudo[180701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ncylvdsdxuemjfzsrqtojuqgugacwqvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162768.4520376-1183-11774835062825/AnsiballZ_systemd.py'
Jan 23 10:06:08 compute-1 sudo[180701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:06:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:06:09 compute-1 python3.9[180718]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:09 compute-1 sudo[180701]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:09 compute-1 sudo[180894]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-buzvbyfjqjhogimpksojwxszrjhcvdvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162769.2079978-1183-127634239776584/AnsiballZ_systemd.py'
Jan 23 10:06:09 compute-1 sudo[180894]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:09 compute-1 python3.9[180896]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:09.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:09 compute-1 sudo[180894]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:10.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:10 compute-1 sudo[181049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqgahoobvouwdihefwstvdmkydmvmdzj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162770.053691-1183-7468798436048/AnsiballZ_systemd.py'
Jan 23 10:06:10 compute-1 sudo[181049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:10 compute-1 python3.9[181051]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:10 compute-1 sudo[181049]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:11 compute-1 sudo[181204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kabjjoexptbyocnwoddrwprwnndjcksr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162770.9925537-1183-121087961648182/AnsiballZ_systemd.py'
Jan 23 10:06:11 compute-1 sudo[181204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:11 compute-1 ceph-mon[80126]: pgmap v379: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:06:11 compute-1 python3.9[181206]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:11 compute-1 sudo[181204]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:11.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:12.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:12 compute-1 sudo[181360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-npsesonminnjmggycyfobbncwhnnpaaf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162771.8470905-1183-274758432673305/AnsiballZ_systemd.py'
Jan 23 10:06:12 compute-1 sudo[181360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:12 compute-1 python3.9[181362]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:12 compute-1 sudo[181360]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:12 compute-1 sudo[181515]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihalmalwukppgmiyztbcufhqwqqfjnrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162772.6382678-1183-53722463197176/AnsiballZ_systemd.py'
Jan 23 10:06:12 compute-1 sudo[181515]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:13 compute-1 ceph-mon[80126]: pgmap v380: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:13 compute-1 python3.9[181517]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:13 compute-1 sudo[181515]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:13.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:13 compute-1 sudo[181688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-usgnwvitorjubuxtkgdqjmzlvlromfer ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162773.564973-1183-112303398041904/AnsiballZ_systemd.py'
Jan 23 10:06:13 compute-1 sudo[181688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:13 compute-1 podman[181645]: 2026-01-23 10:06:13.987668612 +0000 UTC m=+0.133168634 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:06:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:14.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:14 compute-1 python3.9[181693]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:14 compute-1 sudo[181688]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:06:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:06:14 compute-1 sudo[181852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tnqeuycamxiitqftzvmtngskgtahrhei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162774.4591973-1183-214000386835761/AnsiballZ_systemd.py'
Jan 23 10:06:14 compute-1 sudo[181852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:14 compute-1 ceph-mon[80126]: pgmap v381: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:15 compute-1 python3.9[181854]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:15.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:16.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:16 compute-1 sudo[181852]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:16 compute-1 sudo[182008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-antaczapfzhucdubswbbnimdyqmlzkby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162776.5312233-1183-79271937815355/AnsiballZ_systemd.py'
Jan 23 10:06:16 compute-1 sudo[182008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:17 compute-1 python3.9[182010]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:17 compute-1 sudo[182008]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:17 compute-1 ceph-mon[80126]: pgmap v382: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:06:17 compute-1 sudo[182164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-liqjdjbyjbgshcdwfuymmjaackptgdqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162777.3175993-1183-205654333078926/AnsiballZ_systemd.py'
Jan 23 10:06:17 compute-1 sudo[182164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:17.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:17 compute-1 python3.9[182166]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Jan 23 10:06:17 compute-1 sudo[182164]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:18.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:18 compute-1 sudo[182319]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-opvbfxhnjqhcwxhcumeocyiqvrcuysnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162778.4212265-1489-150675622170840/AnsiballZ_file.py'
Jan 23 10:06:18 compute-1 sudo[182319]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:18 compute-1 python3.9[182321]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:18 compute-1 sudo[182319]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:19 compute-1 sudo[182471]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-suvzsqrlzdjuwgnurstzmarjrxqwfocw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162779.113263-1489-135503944748383/AnsiballZ_file.py'
Jan 23 10:06:19 compute-1 sudo[182471]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:19 compute-1 ceph-mon[80126]: pgmap v383: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:06:19 compute-1 python3.9[182474]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:19 compute-1 sudo[182471]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:19.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:19 compute-1 sudo[182551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:06:19 compute-1 sudo[182551]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:19 compute-1 sudo[182551]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:20.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:20 compute-1 sudo[182649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upkbylhpifuzoppctbpczumglddmamyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162779.7448568-1489-120031904313404/AnsiballZ_file.py'
Jan 23 10:06:20 compute-1 sudo[182649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:20 compute-1 python3.9[182651]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:20 compute-1 sudo[182649]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:20 compute-1 sudo[182801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tynvkxozojfjkogyshybijbkbocsxuao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162780.408663-1489-130662430801559/AnsiballZ_file.py'
Jan 23 10:06:20 compute-1 sudo[182801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:06:20 compute-1 python3.9[182803]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:20 compute-1 sudo[182801]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:06:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:06:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:21 compute-1 sudo[182970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilhldttslrqgsmbubmgthzdheogwtbfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162781.2065294-1489-127715676688801/AnsiballZ_file.py'
Jan 23 10:06:21 compute-1 sudo[182970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:21 compute-1 ceph-mon[80126]: pgmap v384: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:06:21 compute-1 python3.9[182972]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:21 compute-1 sudo[182970]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18001c40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:21.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:22.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:22 compute-1 sudo[183122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ikxgbvsjlzbjeejskksdxbinklqwfsav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162781.9063404-1489-121329693546010/AnsiballZ_file.py'
Jan 23 10:06:22 compute-1 sudo[183122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:22 compute-1 python3.9[183124]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:06:22 compute-1 sudo[183122]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:22 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:22 compute-1 ceph-mon[80126]: pgmap v385: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:06:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:23 compute-1 python3.9[183274]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:06:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:23.852 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:23 compute-1 sudo[183425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kbmcvalznkvxtbmycjchbqmtsyfryhsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162783.3902836-1642-68642680155090/AnsiballZ_stat.py'
Jan 23 10:06:23 compute-1 sudo[183425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:24.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:24 compute-1 python3.9[183427]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:24 compute-1 sudo[183425]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:24 compute-1 sudo[183550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wsjatadwwwnzqhuwflcttbzvodxddwzr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162783.3902836-1642-68642680155090/AnsiballZ_copy.py'
Jan 23 10:06:24 compute-1 sudo[183550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100624 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:06:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:24 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:24 compute-1 python3.9[183552]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162783.3902836-1642-68642680155090/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:24 compute-1 sudo[183550]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:25 compute-1 sudo[183702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zltlzhyhktytteanerkrkkhhxaazdugk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162784.9180737-1642-213915064466572/AnsiballZ_stat.py'
Jan 23 10:06:25 compute-1 sudo[183702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:25 compute-1 ceph-mon[80126]: pgmap v386: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:06:25 compute-1 python3.9[183704]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:25 compute-1 sudo[183702]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:25.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.880189) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785880315, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 4222, "num_deletes": 502, "total_data_size": 11737855, "memory_usage": 11936008, "flush_reason": "Manual Compaction"}
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785914772, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4407138, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13045, "largest_seqno": 17262, "table_properties": {"data_size": 4395820, "index_size": 6404, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3845, "raw_key_size": 30531, "raw_average_key_size": 19, "raw_value_size": 4368991, "raw_average_value_size": 2857, "num_data_blocks": 279, "num_entries": 1529, "num_filter_entries": 1529, "num_deletions": 502, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162371, "oldest_key_time": 1769162371, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 34630 microseconds, and 10807 cpu microseconds.
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.914863) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4407138 bytes OK
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.914904) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.916406) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.916468) EVENT_LOG_v1 {"time_micros": 1769162785916458, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.916491) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 11719031, prev total WAL file size 11719031, number of live WAL files 2.
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.919216) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400323532' seq:72057594037927935, type:22 .. '6D67727374617400353034' seq:0, type:0; will stop at (end)
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4303KB)], [27(12MB)]
Jan 23 10:06:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162785919321, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 17631132, "oldest_snapshot_seqno": -1}
Jan 23 10:06:25 compute-1 sudo[183828]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-racicrjnujldbirqmedpemxmscypujqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162784.9180737-1642-213915064466572/AnsiballZ_copy.py'
Jan 23 10:06:25 compute-1 sudo[183828]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 4981 keys, 13174592 bytes, temperature: kUnknown
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786018953, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 13174592, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13139517, "index_size": 21525, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12485, "raw_key_size": 124878, "raw_average_key_size": 25, "raw_value_size": 13047188, "raw_average_value_size": 2619, "num_data_blocks": 899, "num_entries": 4981, "num_filter_entries": 4981, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162785, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.019217) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 13174592 bytes
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.020349) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 176.9 rd, 132.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.2, 12.6 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(7.0) write-amplify(3.0) OK, records in: 5809, records dropped: 828 output_compression: NoCompression
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.020373) EVENT_LOG_v1 {"time_micros": 1769162786020360, "job": 14, "event": "compaction_finished", "compaction_time_micros": 99694, "compaction_time_cpu_micros": 30705, "output_level": 6, "num_output_files": 1, "total_output_size": 13174592, "num_input_records": 5809, "num_output_records": 4981, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786021311, "job": 14, "event": "table_file_deletion", "file_number": 29}
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162786023752, "job": 14, "event": "table_file_deletion", "file_number": 27}
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:25.919058) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023919) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023929) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023931) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023933) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:06:26.023935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:06:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:26.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:26 compute-1 python3.9[183830]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162784.9180737-1642-213915064466572/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:26 compute-1 sudo[183828]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:26 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:26 compute-1 sudo[183980]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rfubvyrmvjjekcnnxjksxkqwiaqkpbap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162786.3138676-1642-13326962495522/AnsiballZ_stat.py'
Jan 23 10:06:26 compute-1 sudo[183980]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:26 compute-1 python3.9[183982]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:26 compute-1 sudo[183980]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:26 compute-1 ceph-mon[80126]: pgmap v387: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:06:27 compute-1 sudo[184105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ejocqoqrvzyrtmdpifvpgpjbmpgsfuoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162786.3138676-1642-13326962495522/AnsiballZ_copy.py'
Jan 23 10:06:27 compute-1 sudo[184105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:27 compute-1 python3.9[184107]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162786.3138676-1642-13326962495522/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:27 compute-1 sudo[184105]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:27 compute-1 podman[184156]: 2026-01-23 10:06:27.653238979 +0000 UTC m=+0.051949213 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:06:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:27 compute-1 sudo[184277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eyoxqvbijtpfdsgobajsgargvgvbzkyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162787.5738187-1642-235121199264350/AnsiballZ_stat.py'
Jan 23 10:06:27 compute-1 sudo[184277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:27.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:28 compute-1 python3.9[184279]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:28.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:28 compute-1 sudo[184277]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:28 compute-1 sudo[184402]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udvjnfzwazlygtohjwgkthbkbnbphlpw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162787.5738187-1642-235121199264350/AnsiballZ_copy.py'
Jan 23 10:06:28 compute-1 sudo[184402]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:28 compute-1 python3.9[184404]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162787.5738187-1642-235121199264350/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:28 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:28 compute-1 sudo[184402]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:29 compute-1 sudo[184554]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlgptlundwmnxzsqxqtorvalejmuzwap ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162788.7690985-1642-16349019001806/AnsiballZ_stat.py'
Jan 23 10:06:29 compute-1 sudo[184554]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:29 compute-1 ceph-mon[80126]: pgmap v388: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:29 compute-1 python3.9[184556]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:29 compute-1 sudo[184554]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:29.862 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:29 compute-1 sudo[184680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mbrqnorpofxkkpoogvoruxraeeuvodzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162788.7690985-1642-16349019001806/AnsiballZ_copy.py'
Jan 23 10:06:29 compute-1 sudo[184680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:30.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:30 compute-1 python3.9[184682]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162788.7690985-1642-16349019001806/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:30 compute-1 sudo[184680]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:30 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140089d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:30 compute-1 sudo[184832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-htsfitgjmgmclehfnxrvmmkjliuwkbdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162790.5594811-1642-38285098709651/AnsiballZ_stat.py'
Jan 23 10:06:30 compute-1 sudo[184832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:31 compute-1 python3.9[184834]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:31 compute-1 sudo[184832]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:31 compute-1 ceph-mon[80126]: pgmap v389: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:31 compute-1 sudo[184960]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mlvxjqwgsudepkvfmlovrrmoujvkswpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162790.5594811-1642-38285098709651/AnsiballZ_copy.py'
Jan 23 10:06:31 compute-1 sudo[184960]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:31 compute-1 sshd-session[184865]: Invalid user funded from 45.148.10.240 port 34210
Jan 23 10:06:31 compute-1 sshd-session[184865]: Connection closed by invalid user funded 45.148.10.240 port 34210 [preauth]
Jan 23 10:06:31 compute-1 python3.9[184962]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162790.5594811-1642-38285098709651/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:31 compute-1 sudo[184960]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:31.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 10:06:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:32.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 10:06:32 compute-1 sudo[185112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-patqijmghbhjbbdqvpkezyosiulrlfvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162791.9281554-1642-112199368998397/AnsiballZ_stat.py'
Jan 23 10:06:32 compute-1 sudo[185112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:32 compute-1 python3.9[185114]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:32 compute-1 sudo[185112]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:32 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:32 compute-1 sudo[185235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mjwbbfsfevmhsvjgrrmjzmrpbqbfxlnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162791.9281554-1642-112199368998397/AnsiballZ_copy.py'
Jan 23 10:06:32 compute-1 sudo[185235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:33 compute-1 python3.9[185237]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162791.9281554-1642-112199368998397/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:33 compute-1 sudo[185235]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:33 compute-1 ceph-mon[80126]: pgmap v390: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:06:33 compute-1 sudo[185388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gnnwelwyevuoacixezzpojqvucscetha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162793.2360907-1642-8608816156380/AnsiballZ_stat.py'
Jan 23 10:06:33 compute-1 sudo[185388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:33 compute-1 python3.9[185390]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:33 compute-1 sudo[185388]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:33.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:34 compute-1 sudo[185513]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-oqrnqaxqnkiyonmxippdkxlljkgbdaqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162793.2360907-1642-8608816156380/AnsiballZ_copy.py'
Jan 23 10:06:34 compute-1 sudo[185513]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:34.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:34 compute-1 python3.9[185515]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769162793.2360907-1642-8608816156380/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:34 compute-1 sudo[185513]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:34 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:35 compute-1 sudo[185665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vwmvzlquxnczecwotdgnozjsnbomlwao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162794.7059846-1981-104201530581584/AnsiballZ_command.py'
Jan 23 10:06:35 compute-1 sudo[185665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:35 compute-1 python3.9[185667]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Jan 23 10:06:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:35 compute-1 sudo[185665]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:35 compute-1 ceph-mon[80126]: pgmap v391: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a140096e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:35.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:35 compute-1 sudo[185819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-flclnysagklrfdptbepayipyrpguvpjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162795.6036391-2008-143908929325275/AnsiballZ_file.py'
Jan 23 10:06:35 compute-1 sudo[185819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:36 compute-1 python3.9[185821]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:36.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:36 compute-1 sudo[185819]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:36 compute-1 sudo[185971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpmnvsaaixknoqqyhqpzjqtzvahvwncj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162796.2815354-2008-35629242854257/AnsiballZ_file.py'
Jan 23 10:06:36 compute-1 sudo[185971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:36 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc0032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:36 compute-1 python3.9[185973]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:36 compute-1 sudo[185971]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:37 compute-1 sudo[186123]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ruwahfweofakscjwypiktllbhumwsdam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162796.886733-2008-229177125521534/AnsiballZ_file.py'
Jan 23 10:06:37 compute-1 sudo[186123]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:37 compute-1 python3.9[186125]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:37 compute-1 sudo[186123]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:37 compute-1 ceph-mon[80126]: pgmap v392: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:37.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:37 compute-1 sudo[186276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nmajdmqdgkeaqoovikswwnfuqgvwgnmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162797.5776708-2008-175641775329308/AnsiballZ_file.py'
Jan 23 10:06:37 compute-1 sudo[186276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:38.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:38 compute-1 python3.9[186278]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:38 compute-1 sudo[186276]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:38 compute-1 sudo[186428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbgvxewfmmmlscqefcwjntthhkuzvzml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162798.3052568-2008-106633853315850/AnsiballZ_file.py'
Jan 23 10:06:38 compute-1 sudo[186428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:38 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:38 compute-1 python3.9[186430]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:38 compute-1 sudo[186428]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:38 compute-1 ceph-mon[80126]: pgmap v393: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc0032f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:39 compute-1 sudo[186580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-psebpsxyexqxucnupolvayumbauttkyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162799.0232797-2008-10304604487386/AnsiballZ_file.py'
Jan 23 10:06:39 compute-1 sudo[186580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:39 compute-1 python3.9[186582]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:39 compute-1 sudo[186580]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:39.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:39 compute-1 sudo[186733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amooywzzzddearbzudfuaozvczkalelj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162799.6351106-2008-215850226041862/AnsiballZ_file.py'
Jan 23 10:06:39 compute-1 sudo[186733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:39 compute-1 sudo[186734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:06:39 compute-1 sudo[186734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:39 compute-1 sudo[186734]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:40.082 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:40 compute-1 python3.9[186740]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:40 compute-1 sudo[186733]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:40 compute-1 sudo[186910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbfibbopvcmfmpoaldrtepjpjvpwrpiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162800.250411-2008-222400841755724/AnsiballZ_file.py'
Jan 23 10:06:40 compute-1 sudo[186910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:40 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:40 compute-1 python3.9[186912]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:40 compute-1 sudo[186910]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:41 compute-1 sudo[187062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gvmxlstkcmzbymxpoymgtqxsmszvjfog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162800.924387-2008-91280220333292/AnsiballZ_file.py'
Jan 23 10:06:41 compute-1 sudo[187062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:41 compute-1 ceph-mon[80126]: pgmap v394: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:41 compute-1 python3.9[187064]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:41 compute-1 sudo[187062]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:41 compute-1 sudo[187215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gcqpkbdzjzbahipjahgblfvkydczevnh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162801.5702717-2008-83807823719998/AnsiballZ_file.py'
Jan 23 10:06:41 compute-1 sudo[187215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:41.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:42 compute-1 python3.9[187217]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:42 compute-1 sudo[187215]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:42.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:42 compute-1 sudo[187367]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aaxwxlxswntawdvhhbbmuyahthllyhpq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162802.2126362-2008-165570495751235/AnsiballZ_file.py'
Jan 23 10:06:42 compute-1 sudo[187367]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:42 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:42 compute-1 python3.9[187369]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:42 compute-1 sudo[187367]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:43 compute-1 ceph-mon[80126]: pgmap v395: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:06:43 compute-1 sudo[187520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-blyvlxqgsfpdnxynliyfsnxpkvlosuvy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162803.1704965-2008-101097225976578/AnsiballZ_file.py'
Jan 23 10:06:43 compute-1 sudo[187520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:43 compute-1 python3.9[187522]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:43 compute-1 sudo[187520]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:43.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:44.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:44 compute-1 sudo[187687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cmssttafvqzxkmpgojjdjaocqanplxrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162803.8793004-2008-72680617938367/AnsiballZ_file.py'
Jan 23 10:06:44 compute-1 sudo[187687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:44 compute-1 podman[187646]: 2026-01-23 10:06:44.215900651 +0000 UTC m=+0.091543880 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Jan 23 10:06:44 compute-1 python3.9[187692]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:44 compute-1 sudo[187687]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:44 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:44 compute-1 sudo[187850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jpfzhfftkmgypmfubgpldrwyfieububu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162804.5486717-2008-39693017329756/AnsiballZ_file.py'
Jan 23 10:06:44 compute-1 sudo[187850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:45 compute-1 python3.9[187852]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:45 compute-1 sudo[187850]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:45 compute-1 ceph-mon[80126]: pgmap v396: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:45 compute-1 sudo[188003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jloimqvbdknabdhpzqmiytgekoitwair ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162805.293492-2305-269383149811741/AnsiballZ_stat.py'
Jan 23 10:06:45 compute-1 sudo[188003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:45 compute-1 python3.9[188005]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:45 compute-1 sudo[188003]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:45.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:46.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:46 compute-1 sudo[188126]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rqfrujtuotdrijjfcizvimxufibdjegv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162805.293492-2305-269383149811741/AnsiballZ_copy.py'
Jan 23 10:06:46 compute-1 sudo[188126]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:46 compute-1 sudo[188127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:06:46 compute-1 sudo[188127]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:46 compute-1 sudo[188127]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:46 compute-1 sudo[188154]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:06:46 compute-1 sudo[188154]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:46 compute-1 python3.9[188136]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162805.293492-2305-269383149811741/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:46 compute-1 sudo[188126]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:46 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:46 compute-1 sudo[188154]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:46 compute-1 sudo[188360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-keykhzgrrazatcdhmoexmugtebasuort ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162806.4529254-2305-211999890843799/AnsiballZ_stat.py'
Jan 23 10:06:46 compute-1 sudo[188360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:46 compute-1 python3.9[188362]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:46 compute-1 sudo[188360]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:47 compute-1 sudo[188483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ctnayqpwzqafwvojmrpddflapmfxmjyw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162806.4529254-2305-211999890843799/AnsiballZ_copy.py'
Jan 23 10:06:47 compute-1 sudo[188483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:47 compute-1 python3.9[188485]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162806.4529254-2305-211999890843799/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:47 compute-1 sudo[188483]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:47 compute-1 ceph-mon[80126]: pgmap v397: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:47.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:47 compute-1 sudo[188636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dewpgkixsvynymqmrvaqosizwkxfguww ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162807.6888402-2305-95273635115613/AnsiballZ_stat.py'
Jan 23 10:06:47 compute-1 sudo[188636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:48.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:48 compute-1 python3.9[188638]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:48 compute-1 sudo[188636]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:48 compute-1 sudo[188759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuufhokucpudkmywqrwnznagxfzrewls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162807.6888402-2305-95273635115613/AnsiballZ_copy.py'
Jan 23 10:06:48 compute-1 sudo[188759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:48 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:48 compute-1 python3.9[188761]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162807.6888402-2305-95273635115613/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:48 compute-1 sudo[188759]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:49 compute-1 sudo[188911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ganekjnkvgcuhjpkgwvhpkdxbxhewgja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162808.8653703-2305-36663104291640/AnsiballZ_stat.py'
Jan 23 10:06:49 compute-1 sudo[188911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:49 compute-1 python3.9[188913]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:49 compute-1 sudo[188911]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:49 compute-1 ceph-mon[80126]: pgmap v398: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100649 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:06:49 compute-1 sudo[189035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qipixkuamuhrzjxsmwxgvxczzxcnnghz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162808.8653703-2305-36663104291640/AnsiballZ_copy.py'
Jan 23 10:06:49 compute-1 sudo[189035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:49 compute-1 python3.9[189037]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162808.8653703-2305-36663104291640/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:06:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:49.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:06:49 compute-1 sudo[189035]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:50.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:50 compute-1 sudo[189187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-shpcpxatgcwdlqokfpvftkxtqmsjnqtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162810.051862-2305-281331431970054/AnsiballZ_stat.py'
Jan 23 10:06:50 compute-1 sudo[189187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:50 compute-1 python3.9[189189]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:50 compute-1 sudo[189187]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:50 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:06:50 compute-1 ceph-mon[80126]: pgmap v399: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:06:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:06:50 compute-1 sudo[189310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ospdqslmawuuboayzruxnlabmxfrmgct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162810.051862-2305-281331431970054/AnsiballZ_copy.py'
Jan 23 10:06:50 compute-1 sudo[189310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:51 compute-1 python3.9[189312]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162810.051862-2305-281331431970054/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:51 compute-1 sudo[189310]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:51 compute-1 sudo[189463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sygcgsobgftetksmxwomfghjxzkpulbc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162811.3475583-2305-227451832088885/AnsiballZ_stat.py'
Jan 23 10:06:51 compute-1 sudo[189463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:51 compute-1 python3.9[189465]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:51 compute-1 sudo[189463]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:51.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:52 compute-1 sudo[189586]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xgjznrxgzqtevxztvacgwkzudkmngqbx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162811.3475583-2305-227451832088885/AnsiballZ_copy.py'
Jan 23 10:06:52 compute-1 sudo[189586]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:52.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:06:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100652 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:06:52 compute-1 python3.9[189588]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162811.3475583-2305-227451832088885/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:52 compute-1 sudo[189586]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:52 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:52 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:52 compute-1 sudo[189740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lqbmkeiyereietrzbqgkcspdjehmphmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162812.599231-2305-173728037810526/AnsiballZ_stat.py'
Jan 23 10:06:52 compute-1 sudo[189740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:53 compute-1 python3.9[189742]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:53 compute-1 sudo[189740]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:53 compute-1 ceph-mon[80126]: pgmap v400: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:06:53 compute-1 sudo[189864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvwjasyqjgncbgdgeyuoucttrdtwcgms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162812.599231-2305-173728037810526/AnsiballZ_copy.py'
Jan 23 10:06:53 compute-1 sudo[189864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:53 compute-1 python3.9[189866]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162812.599231-2305-173728037810526/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:53 compute-1 sudo[189864]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:53.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:54 compute-1 sudo[190016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-apfhgmliawokmxrfaatrmiequefoucku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162813.8691988-2305-172959392001028/AnsiballZ_stat.py'
Jan 23 10:06:54 compute-1 sudo[190016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:54.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:54 compute-1 python3.9[190018]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:54 compute-1 sudo[190016]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:54 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:54 compute-1 sudo[190139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-aycykswhegsfrgjjkfbfqqjnjjsvwcgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162813.8691988-2305-172959392001028/AnsiballZ_copy.py'
Jan 23 10:06:54 compute-1 sudo[190139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:54 compute-1 python3.9[190141]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162813.8691988-2305-172959392001028/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:54 compute-1 sudo[190139]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:06:55.032 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:06:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:06:55.033 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:06:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:06:55.033 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:06:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:55 compute-1 sudo[190292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exymimuwgsnasszkkobkpfbzkfgsbpqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162815.1196125-2305-232542808964308/AnsiballZ_stat.py'
Jan 23 10:06:55 compute-1 sudo[190292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:55 compute-1 python3.9[190294]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:55 compute-1 sudo[190292]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:55 compute-1 ceph-mon[80126]: pgmap v401: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:06:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:55.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:55 compute-1 sudo[190415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lmjjforszmlidswlcxaabgikzywxhtfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162815.1196125-2305-232542808964308/AnsiballZ_copy.py'
Jan 23 10:06:55 compute-1 sudo[190415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:56 compute-1 sudo[190418]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:06:56 compute-1 sudo[190418]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:06:56 compute-1 sudo[190418]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:56 compute-1 python3.9[190417]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162815.1196125-2305-232542808964308/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:56 compute-1 sudo[190415]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:56.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:56 compute-1 sudo[190592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cghcisyhhspqrfdmbfticiiuzuvozvrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162816.33545-2305-12300031237612/AnsiballZ_stat.py'
Jan 23 10:06:56 compute-1 sudo[190592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:56 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:56 compute-1 python3.9[190594]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:56 compute-1 sudo[190592]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:06:57 compute-1 ceph-mon[80126]: pgmap v402: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:06:57 compute-1 sudo[190715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bvirrodxwlpjdghhwuvpjyozwvjvahir ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162816.33545-2305-12300031237612/AnsiballZ_copy.py'
Jan 23 10:06:57 compute-1 sudo[190715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:57 compute-1 python3.9[190717]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162816.33545-2305-12300031237612/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:57 compute-1 sudo[190715]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:57 compute-1 sudo[190882]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hyhtbuidchtxutbnhbvlinodniitrfbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162817.4661543-2305-20698561615825/AnsiballZ_stat.py'
Jan 23 10:06:57 compute-1 podman[190842]: 2026-01-23 10:06:57.774632371 +0000 UTC m=+0.065502208 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 10:06:57 compute-1 sudo[190882]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:06:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:57.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:57 compute-1 python3.9[190890]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:57 compute-1 sudo[190882]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:06:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:06:58.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:06:58 compute-1 sudo[191011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-esgqmletnzloopvvqmxmnhrfmhnqtefy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162817.4661543-2305-20698561615825/AnsiballZ_copy.py'
Jan 23 10:06:58 compute-1 sudo[191011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:58 compute-1 python3.9[191013]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162817.4661543-2305-20698561615825/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:58 compute-1 sudo[191011]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:58 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:59 compute-1 sudo[191163]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-plwjshkkktsyeucswivmlpacstbulnld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162818.7391837-2305-99990440193089/AnsiballZ_stat.py'
Jan 23 10:06:59 compute-1 sudo[191163]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:59 compute-1 python3.9[191165]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:06:59 compute-1 sudo[191163]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:06:59 compute-1 ceph-mon[80126]: pgmap v403: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:06:59 compute-1 sudo[191287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-owyxbgaxamjctmbstswmbeoxeqtfpnic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162818.7391837-2305-99990440193089/AnsiballZ_copy.py'
Jan 23 10:06:59 compute-1 sudo[191287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:06:59 compute-1 python3.9[191289]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162818.7391837-2305-99990440193089/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:06:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:06:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:06:59 compute-1 sudo[191287]: pam_unix(sudo:session): session closed for user root
Jan 23 10:06:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:06:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:06:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:06:59.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:00 compute-1 sudo[191366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:07:00 compute-1 sudo[191366]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:00 compute-1 sudo[191366]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:00 compute-1 sudo[191464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-makrwnxskulknbkojuiywxdqokpykmma ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162819.9215534-2305-269813669787520/AnsiballZ_stat.py'
Jan 23 10:07:00 compute-1 sudo[191464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:00.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:00 compute-1 python3.9[191466]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:00 compute-1 sudo[191464]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:00 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:00 compute-1 sudo[191587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kvuadzcgbnkclmvhyzpowqbvjeghdwqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162819.9215534-2305-269813669787520/AnsiballZ_copy.py'
Jan 23 10:07:00 compute-1 sudo[191587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:01 compute-1 python3.9[191589]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162819.9215534-2305-269813669787520/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:01 compute-1 sudo[191587]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:01 compute-1 ceph-mon[80126]: pgmap v404: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:07:01 compute-1 sudo[191740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uypdrjwjpntucuegalwimopqzpwjjxif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162821.2720916-2305-80890308419736/AnsiballZ_stat.py'
Jan 23 10:07:01 compute-1 sudo[191740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:01 compute-1 python3.9[191742]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:01 compute-1 sudo[191740]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:01.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:02 compute-1 sudo[191863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dousuultluwgfhlntqbiotqhclcvktmv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162821.2720916-2305-80890308419736/AnsiballZ_copy.py'
Jan 23 10:07:02 compute-1 sudo[191863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:02.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:02 compute-1 python3.9[191865]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162821.2720916-2305-80890308419736/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:07:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:07:02 compute-1 sudo[191863]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:03 compute-1 python3.9[192015]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                             ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f40016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:03 compute-1 ceph-mon[80126]: pgmap v405: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 511 B/s wr, 1 op/s
Jan 23 10:07:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:03.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:04 compute-1 sudo[192169]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ihridkukwjgpdgvordvemjeivokndxnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162823.5878215-2923-183812961493642/AnsiballZ_seboolean.py'
Jan 23 10:07:04 compute-1 sudo[192169]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:04 compute-1 python3.9[192171]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Jan 23 10:07:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:04.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:04 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:04 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:07:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:05 compute-1 ceph-mon[80126]: pgmap v406: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 511 B/s wr, 1 op/s
Jan 23 10:07:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:05 compute-1 sudo[192169]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:07:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:05.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:07:06 compute-1 sudo[192326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lpvnvpxyvscvfrywimxyyeawlewaairo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162825.7320569-2947-58160492835234/AnsiballZ_copy.py'
Jan 23 10:07:06 compute-1 dbus-broker-launch[774]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Jan 23 10:07:06 compute-1 sudo[192326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:06 compute-1 python3.9[192328]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:06 compute-1 sudo[192326]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:06.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:06 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:06 compute-1 ceph-mon[80126]: pgmap v407: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:07:07 compute-1 sudo[192478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lfisvsvbshhahseminiutoroxrtvsdmw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162826.3839571-2947-145851538537487/AnsiballZ_copy.py'
Jan 23 10:07:07 compute-1 sudo[192478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:07 compute-1 python3.9[192480]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:07 compute-1 sudo[192478]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:07:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:07.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:07 compute-1 sudo[192631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sdvqfyttesldpiabjjxpqwwikptvltqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162827.6126053-2947-87628800480604/AnsiballZ_copy.py'
Jan 23 10:07:07 compute-1 sudo[192631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:08 compute-1 python3.9[192633]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:08 compute-1 sudo[192631]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:07:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:08.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:07:08 compute-1 sudo[192783]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfacyzjukalfnonpwvtekonldnfwycdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162828.3010828-2947-179056439711689/AnsiballZ_copy.py'
Jan 23 10:07:08 compute-1 sudo[192783]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:08 compute-1 python3.9[192785]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:08 compute-1 sudo[192783]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:09 compute-1 sudo[192935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qayyppnwnkhyplcmgdeqpqefgisiyuyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162828.9414775-2947-133330164146778/AnsiballZ_copy.py'
Jan 23 10:07:09 compute-1 sudo[192935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:09 compute-1 python3.9[192937]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:09 compute-1 sudo[192935]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:09 compute-1 ceph-mon[80126]: pgmap v408: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:07:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:09.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:10 compute-1 sudo[193088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-emyhhtpbfvgpxokeizfqcvqrrdgfeegr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162829.8035393-3055-76910252813840/AnsiballZ_copy.py'
Jan 23 10:07:10 compute-1 sudo[193088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:10.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:10 compute-1 python3.9[193090]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:10 compute-1 sudo[193088]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:10 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:10 compute-1 sudo[193240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-crfxmgjjwgbkgvicbdrwaervnsrtgtkc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162830.4484985-3055-191525147033912/AnsiballZ_copy.py'
Jan 23 10:07:10 compute-1 sudo[193240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:10 compute-1 python3.9[193242]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:10 compute-1 sudo[193240]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:11 compute-1 sudo[193392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hfdbnplahrrtvpbsqipwprcdvbvqftla ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162831.0773559-3055-103092335489917/AnsiballZ_copy.py'
Jan 23 10:07:11 compute-1 sudo[193392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:11 compute-1 python3.9[193394]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:11 compute-1 sudo[193392]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:11 compute-1 ceph-mon[80126]: pgmap v409: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:07:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100711 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:07:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:11.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:11 compute-1 sudo[193545]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-udejacpuzrknavbgzcydinnkhaptpnii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162831.701637-3055-132303988010722/AnsiballZ_copy.py'
Jan 23 10:07:11 compute-1 sudo[193545]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:12 compute-1 python3.9[193547]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:12 compute-1 sudo[193545]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:12.270 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:12 compute-1 sudo[193697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-afkyukibsopzolmiixyuslbhzqilsfat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162832.3667755-3055-9117136190977/AnsiballZ_copy.py'
Jan 23 10:07:12 compute-1 sudo[193697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:12 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:12 compute-1 python3.9[193699]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:12 compute-1 ceph-mon[80126]: pgmap v410: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 1.1 KiB/s wr, 4 op/s
Jan 23 10:07:12 compute-1 sudo[193697]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.835200) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832835278, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 707, "num_deletes": 251, "total_data_size": 1525072, "memory_usage": 1546200, "flush_reason": "Manual Compaction"}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832843985, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 986082, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17267, "largest_seqno": 17969, "table_properties": {"data_size": 982539, "index_size": 1387, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7958, "raw_average_key_size": 19, "raw_value_size": 975535, "raw_average_value_size": 2373, "num_data_blocks": 61, "num_entries": 411, "num_filter_entries": 411, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162786, "oldest_key_time": 1769162786, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 8850 microseconds, and 4469 cpu microseconds.
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.844060) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 986082 bytes OK
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.844086) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847798) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847827) EVENT_LOG_v1 {"time_micros": 1769162832847819, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.847850) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 1521240, prev total WAL file size 1521240, number of live WAL files 2.
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.848867) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031303034' seq:72057594037927935, type:22 .. '7061786F730031323536' seq:0, type:0; will stop at (end)
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(962KB)], [30(12MB)]
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832848920, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 14160674, "oldest_snapshot_seqno": -1}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 4877 keys, 11786079 bytes, temperature: kUnknown
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832929632, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 11786079, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11752713, "index_size": 20072, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12229, "raw_key_size": 123348, "raw_average_key_size": 25, "raw_value_size": 11663198, "raw_average_value_size": 2391, "num_data_blocks": 835, "num_entries": 4877, "num_filter_entries": 4877, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162832, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.929872) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 11786079 bytes
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.931549) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 175.3 rd, 145.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.6 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(26.3) write-amplify(12.0) OK, records in: 5392, records dropped: 515 output_compression: NoCompression
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.931567) EVENT_LOG_v1 {"time_micros": 1769162832931559, "job": 16, "event": "compaction_finished", "compaction_time_micros": 80778, "compaction_time_cpu_micros": 40950, "output_level": 6, "num_output_files": 1, "total_output_size": 11786079, "num_input_records": 5392, "num_output_records": 4877, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832931817, "job": 16, "event": "table_file_deletion", "file_number": 32}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162832933971, "job": 16, "event": "table_file_deletion", "file_number": 30}
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.848709) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934010) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934014) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934021) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:12 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:07:12.934023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:07:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:13 compute-1 sudo[193849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bxpybjjqeuntsbczfjwtmamkidxdtnwr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162833.0338936-3163-96080368516671/AnsiballZ_systemd.py'
Jan 23 10:07:13 compute-1 sudo[193849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:13 compute-1 python3.9[193851]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:13 compute-1 systemd[1]: Reloading.
Jan 23 10:07:13 compute-1 systemd-rc-local-generator[193879]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:13 compute-1 systemd-sysv-generator[193882]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:13.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:14 compute-1 systemd[1]: Starting libvirt logging daemon socket...
Jan 23 10:07:14 compute-1 systemd[1]: Listening on libvirt logging daemon socket.
Jan 23 10:07:14 compute-1 systemd[1]: Starting libvirt logging daemon admin socket...
Jan 23 10:07:14 compute-1 systemd[1]: Listening on libvirt logging daemon admin socket.
Jan 23 10:07:14 compute-1 systemd[1]: Starting libvirt logging daemon...
Jan 23 10:07:14 compute-1 systemd[1]: Started libvirt logging daemon.
Jan 23 10:07:14 compute-1 sudo[193849]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:14.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100714 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:07:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:14 compute-1 sudo[194060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kstwuuqsppdtzcubmttafyaeretiyfbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162834.3525574-3163-55954309676961/AnsiballZ_systemd.py'
Jan 23 10:07:14 compute-1 sudo[194060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:14 compute-1 podman[194017]: 2026-01-23 10:07:14.704479712 +0000 UTC m=+0.100443515 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 10:07:14 compute-1 python3.9[194067]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:14 compute-1 systemd[1]: Reloading.
Jan 23 10:07:15 compute-1 systemd-rc-local-generator[194099]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:15 compute-1 systemd-sysv-generator[194104]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:15 compute-1 systemd[1]: Starting libvirt nodedev daemon socket...
Jan 23 10:07:15 compute-1 systemd[1]: Listening on libvirt nodedev daemon socket.
Jan 23 10:07:15 compute-1 systemd[1]: Starting libvirt nodedev daemon admin socket...
Jan 23 10:07:15 compute-1 ceph-mon[80126]: pgmap v411: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 597 B/s wr, 2 op/s
Jan 23 10:07:15 compute-1 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Jan 23 10:07:15 compute-1 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Jan 23 10:07:15 compute-1 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Jan 23 10:07:15 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 10:07:15 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 23 10:07:15 compute-1 sudo[194060]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:15 compute-1 sudo[194287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-awtijsktozdqfgizybqpnqkgnvvwunds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162835.5520403-3163-20924769932278/AnsiballZ_systemd.py'
Jan 23 10:07:15 compute-1 sudo[194287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:15.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:16 compute-1 python3.9[194289]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:16 compute-1 systemd[1]: Reloading.
Jan 23 10:07:16 compute-1 systemd-sysv-generator[194315]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:16 compute-1 systemd-rc-local-generator[194312]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:16.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:16 compute-1 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Jan 23 10:07:16 compute-1 systemd[1]: Starting libvirt proxy daemon admin socket...
Jan 23 10:07:16 compute-1 systemd[1]: Starting libvirt proxy daemon read-only socket...
Jan 23 10:07:16 compute-1 systemd[1]: Listening on libvirt proxy daemon admin socket.
Jan 23 10:07:16 compute-1 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Jan 23 10:07:16 compute-1 systemd[1]: Starting libvirt proxy daemon...
Jan 23 10:07:16 compute-1 systemd[1]: Started libvirt proxy daemon.
Jan 23 10:07:16 compute-1 sudo[194287]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:16 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:16 compute-1 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Jan 23 10:07:16 compute-1 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Jan 23 10:07:16 compute-1 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Jan 23 10:07:17 compute-1 sudo[194506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ycxocbbiwtklluubumrknbpquvpynoye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162836.7187636-3163-203445538641442/AnsiballZ_systemd.py'
Jan 23 10:07:17 compute-1 sudo[194506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:17 compute-1 python3.9[194508]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:17 compute-1 systemd[1]: Reloading.
Jan 23 10:07:17 compute-1 ceph-mon[80126]: pgmap v412: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.9 KiB/s rd, 597 B/s wr, 2 op/s
Jan 23 10:07:17 compute-1 systemd-rc-local-generator[194537]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:17 compute-1 systemd-sysv-generator[194543]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:17 compute-1 systemd[1]: Listening on libvirt locking daemon socket.
Jan 23 10:07:17 compute-1 systemd[1]: Starting libvirt QEMU daemon socket...
Jan 23 10:07:17 compute-1 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jan 23 10:07:17 compute-1 systemd[1]: Starting Virtual Machine and Container Registration Service...
Jan 23 10:07:17 compute-1 systemd[1]: Listening on libvirt QEMU daemon socket.
Jan 23 10:07:17 compute-1 systemd[1]: Starting libvirt QEMU daemon admin socket...
Jan 23 10:07:17 compute-1 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Jan 23 10:07:17 compute-1 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Jan 23 10:07:17 compute-1 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Jan 23 10:07:17 compute-1 systemd[1]: Started Virtual Machine and Container Registration Service.
Jan 23 10:07:17 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 10:07:17 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 23 10:07:17 compute-1 sudo[194506]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:17.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:17 compute-1 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4de0b908-b857-4837-917a-7201a6fb06a8
Jan 23 10:07:17 compute-1 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 23 10:07:17 compute-1 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 4de0b908-b857-4837-917a-7201a6fb06a8
Jan 23 10:07:17 compute-1 setroubleshoot[194326]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                  
                                                  *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                  
                                                  If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                  Then turn on full auditing to get path information about the offending file and generate the error again.
                                                  Do
                                                  
                                                  Turn on full auditing
                                                  # auditctl -w /etc/shadow -p w
                                                  Try to recreate AVC. Then execute
                                                  # ausearch -m avc -ts recent
                                                  If you see PATH record check ownership/permissions on file, and fix it,
                                                  otherwise report as a bugzilla.
                                                  
                                                  *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                  
                                                  If you believe that virtlogd should have the dac_read_search capability by default.
                                                  Then you should report this as a bug.
                                                  You can generate a local policy module to allow this access.
                                                  Do
                                                  allow this access for now by executing:
                                                  # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                  # semodule -X 300 -i my-virtlogd.pp
                                                  
Jan 23 10:07:18 compute-1 sudo[194725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fmgluxfynaaxhoaaaejstrholgplpudu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162837.9256794-3163-22441178748458/AnsiballZ_systemd.py'
Jan 23 10:07:18 compute-1 sudo[194725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:18.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:18 compute-1 python3.9[194727]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:07:18 compute-1 systemd[1]: Reloading.
Jan 23 10:07:18 compute-1 systemd-sysv-generator[194757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:18 compute-1 systemd-rc-local-generator[194754]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:18 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:18 compute-1 systemd[1]: Starting libvirt secret daemon socket...
Jan 23 10:07:18 compute-1 systemd[1]: Listening on libvirt secret daemon socket.
Jan 23 10:07:18 compute-1 systemd[1]: Starting libvirt secret daemon admin socket...
Jan 23 10:07:18 compute-1 systemd[1]: Starting libvirt secret daemon read-only socket...
Jan 23 10:07:18 compute-1 systemd[1]: Listening on libvirt secret daemon admin socket.
Jan 23 10:07:18 compute-1 systemd[1]: Listening on libvirt secret daemon read-only socket.
Jan 23 10:07:18 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 23 10:07:18 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 23 10:07:18 compute-1 sudo[194725]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:19 compute-1 ceph-mon[80126]: pgmap v413: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:07:19 compute-1 sudo[194937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jutgpwosmuppajvnewymsjpozazehsoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162839.301323-3274-70458257964841/AnsiballZ_file.py'
Jan 23 10:07:19 compute-1 sudo[194937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:19 compute-1 python3.9[194939]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:19 compute-1 sudo[194937]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:19.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:20 compute-1 sudo[195023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:07:20 compute-1 sudo[195023]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:20 compute-1 sudo[195023]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:20 compute-1 sudo[195114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yvcedtjuhsnqbejbidecnjrzjatzrrol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162839.9854565-3298-98731580999884/AnsiballZ_find.py'
Jan 23 10:07:20 compute-1 sudo[195114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:20.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:20 compute-1 python3.9[195116]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:07:20 compute-1 sudo[195114]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:20 compute-1 sudo[195266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qxpjdssdoaoyakhmevapoigdaillnknc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162840.6465409-3322-240580048328995/AnsiballZ_command.py'
Jan 23 10:07:20 compute-1 sudo[195266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:21 compute-1 python3.9[195268]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                             echo ceph
                                             awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:21 compute-1 sudo[195266]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:21 compute-1 ceph-mon[80126]: pgmap v414: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:07:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:21 compute-1 python3.9[195423]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:07:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:21.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:22.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:22 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc004000 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:22 compute-1 python3.9[195573]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:23 compute-1 ceph-mon[80126]: pgmap v415: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:07:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:23 compute-1 python3.9[195694]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162842.3655317-3379-244738027878309/.source.xml follow=False _original_basename=secret.xml.j2 checksum=19688f6e42a741164eafec41a84b8e73a76d185a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:23 compute-1 sudo[195847]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqyjrarucirsscxtrutdiqoiesfuoqus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162843.587794-3424-227803084560615/AnsiballZ_command.py'
Jan 23 10:07:23 compute-1 sudo[195847]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:23.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:24 compute-1 python3.9[195849]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine f3005f84-239a-55b6-a948-8f1fb592b920
                                             virsh secret-define --file /tmp/secret.xml
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:24 compute-1 polkitd[43458]: Registered Authentication Agent for unix-process:195851:399858 (system bus name :1.1839 [pkttyagent --process 195851 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 23 10:07:24 compute-1 polkitd[43458]: Unregistered Authentication Agent for unix-process:195851:399858 (system bus name :1.1839, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 23 10:07:24 compute-1 polkitd[43458]: Registered Authentication Agent for unix-process:195850:399857 (system bus name :1.1840 [pkttyagent --process 195850 --notify-fd 5 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 23 10:07:24 compute-1 polkitd[43458]: Unregistered Authentication Agent for unix-process:195850:399857 (system bus name :1.1840, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 23 10:07:24 compute-1 sudo[195847]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:24.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:24 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:24 compute-1 python3.9[196011]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:25 compute-1 sudo[196162]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dhmlvfeyvtpqnjgabdabcokjzstuytff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162845.1257656-3472-194194758942566/AnsiballZ_command.py'
Jan 23 10:07:25 compute-1 ceph-mon[80126]: pgmap v416: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:25 compute-1 sudo[196162]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:25 compute-1 sudo[196162]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:25.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:26 compute-1 auditd[701]: Audit daemon rotating log files
Jan 23 10:07:26 compute-1 sudo[196315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kageqelpqbelvwptosjjtcyzytarhwiy ; FSID=f3005f84-239a-55b6-a948-8f1fb592b920 KEY=AQB8Q3NpAAAAABAATAj6yCl+1UaIO/yyy7nUXA== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162845.8167348-3496-87145597436125/AnsiballZ_command.py'
Jan 23 10:07:26 compute-1 sudo[196315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:26.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:26 compute-1 polkitd[43458]: Registered Authentication Agent for unix-process:196318:400076 (system bus name :1.1843 [pkttyagent --process 196318 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8)
Jan 23 10:07:26 compute-1 polkitd[43458]: Unregistered Authentication Agent for unix-process:196318:400076 (system bus name :1.1843, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale en_US.UTF-8) (disconnected from bus)
Jan 23 10:07:26 compute-1 sudo[196315]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:26 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:26 compute-1 sudo[196473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ssneijvngtyuvmmrfgzobxwvkwwygefy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162846.551497-3520-119366508179074/AnsiballZ_copy.py'
Jan 23 10:07:26 compute-1 sudo[196473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:27 compute-1 python3.9[196475]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:27 compute-1 sudo[196473]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:27 compute-1 ceph-mon[80126]: pgmap v417: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:27 compute-1 sudo[196626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ymkwqxwvysqeudwetlgdvavoindaupth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162847.2236483-3544-213809944158388/AnsiballZ_stat.py'
Jan 23 10:07:27 compute-1 sudo[196626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:27 compute-1 python3.9[196628]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:27 compute-1 sudo[196626]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:27.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:28 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Jan 23 10:07:28 compute-1 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Consumed 1.064s CPU time.
Jan 23 10:07:28 compute-1 sudo[196762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ygcoqcjhteplvligmisehzukhpqjmxpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162847.2236483-3544-213809944158388/AnsiballZ_copy.py'
Jan 23 10:07:28 compute-1 sudo[196762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:28 compute-1 podman[196723]: 2026-01-23 10:07:28.068672406 +0000 UTC m=+0.061427187 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 10:07:28 compute-1 systemd[1]: setroubleshootd.service: Deactivated successfully.
Jan 23 10:07:28 compute-1 python3.9[196770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162847.2236483-3544-213809944158388/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:28 compute-1 sudo[196762]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:28.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:28 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a3f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:28 compute-1 sudo[196920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kosfxhkakueihkphozmzfwwjetblwtmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162848.5941038-3592-217941661948406/AnsiballZ_file.py'
Jan 23 10:07:28 compute-1 sudo[196920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:29 compute-1 python3.9[196922]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:29 compute-1 sudo[196920]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:29 compute-1 sudo[197073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mrrihmvsjmskliulpagteanyhslbxbbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162849.268815-3616-50464144949080/AnsiballZ_stat.py'
Jan 23 10:07:29 compute-1 sudo[197073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:29 compute-1 ceph-mon[80126]: pgmap v418: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:29 compute-1 python3.9[197075]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:29 compute-1 sudo[197073]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:29.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:30 compute-1 sudo[197151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmdnhbusjavimusvcobvwvogpaxzirmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162849.268815-3616-50464144949080/AnsiballZ_file.py'
Jan 23 10:07:30 compute-1 sudo[197151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:30 compute-1 python3.9[197153]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:30 compute-1 sudo[197151]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:30.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:30 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:30 compute-1 sudo[197303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-arkahdpvzykvbcgceqvomuyxxdaovlyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162850.4250405-3652-217096781678719/AnsiballZ_stat.py'
Jan 23 10:07:30 compute-1 sudo[197303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:30 compute-1 python3.9[197305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:30 compute-1 sudo[197303]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:31 compute-1 sudo[197381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-raphnlacvdimruhexwkxyeuixrxuyand ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162850.4250405-3652-217096781678719/AnsiballZ_file.py'
Jan 23 10:07:31 compute-1 sudo[197381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:31 compute-1 python3.9[197383]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.qm76jcti recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:31 compute-1 sudo[197381]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:31 compute-1 ceph-mon[80126]: pgmap v419: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:31 compute-1 sudo[197534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gefzopvduwxxzyebmmxoaiqpuahromka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162851.5122852-3688-127973012983401/AnsiballZ_stat.py'
Jan 23 10:07:31 compute-1 sudo[197534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:31.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:31 compute-1 python3.9[197536]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:32 compute-1 sudo[197534]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:32 compute-1 sudo[197612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-krbiqtfcltwpaenoebtpxpqnykgfjwve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162851.5122852-3688-127973012983401/AnsiballZ_file.py'
Jan 23 10:07:32 compute-1 sudo[197612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:32.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:32 compute-1 python3.9[197614]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:32 compute-1 sudo[197612]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:32 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:32 compute-1 sudo[197764]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ntwfakglqrxlkwpqzohdehkdfhntamua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162852.69691-3727-73942676419968/AnsiballZ_command.py'
Jan 23 10:07:32 compute-1 sudo[197764]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:33 compute-1 python3.9[197766]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:33 compute-1 sudo[197764]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:33 compute-1 ceph-mon[80126]: pgmap v420: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:07:33 compute-1 sudo[197918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xhlhbuivibdsshebcqhbmytabsgzjhlj ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769162853.353474-3751-259444500114980/AnsiballZ_edpm_nftables_from_files.py'
Jan 23 10:07:33 compute-1 sudo[197918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a430 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:33 compute-1 python3[197920]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Jan 23 10:07:33 compute-1 sudo[197918]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:07:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:33.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:07:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:34.302 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:34 compute-1 sudo[198070]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-huwfdpfyzqiudvulpslipvpxpyzwxfwa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162854.1250947-3775-57303365232726/AnsiballZ_stat.py'
Jan 23 10:07:34 compute-1 sudo[198070]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:34 compute-1 python3.9[198072]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:34 compute-1 sudo[198070]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:34 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:34 compute-1 ceph-mon[80126]: pgmap v421: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:34 compute-1 sudo[198148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gqsybpagtzjrcxnrwairlfleqabhgnii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162854.1250947-3775-57303365232726/AnsiballZ_file.py'
Jan 23 10:07:34 compute-1 sudo[198148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:35 compute-1 python3.9[198150]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:35 compute-1 sudo[198148]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:35 compute-1 sudo[198301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fcwmxgblkuycmexlysqgsywzyyyfbjiz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162855.2519152-3811-183755194409300/AnsiballZ_stat.py'
Jan 23 10:07:35 compute-1 sudo[198301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:35 compute-1 python3.9[198303]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:35 compute-1 sudo[198301]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f8002ad0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:35.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:36 compute-1 sudo[198426]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mtosphlacdqxduixuoerxctaibjobhdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162855.2519152-3811-183755194409300/AnsiballZ_copy.py'
Jan 23 10:07:36 compute-1 sudo[198426]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:36 compute-1 python3.9[198428]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162855.2519152-3811-183755194409300/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:36 compute-1 sudo[198426]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:36.305 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:36 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a450 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:36 compute-1 ceph-mon[80126]: pgmap v422: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:36 compute-1 sudo[198578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mshumkowflimbbodxtgdpfwtflqxhifw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162856.4777339-3856-226963999676535/AnsiballZ_stat.py'
Jan 23 10:07:36 compute-1 sudo[198578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:36 compute-1 python3.9[198580]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:36 compute-1 sudo[198578]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:37 compute-1 sudo[198656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-guvhlztqxhuqdaxnkglckxdzpqjpgjfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162856.4777339-3856-226963999676535/AnsiballZ_file.py'
Jan 23 10:07:37 compute-1 sudo[198656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:37 compute-1 python3.9[198658]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:37 compute-1 sudo[198656]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:37 compute-1 sudo[198809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tpdoqmccpwxukahyltvknejjjmcxmzid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162857.541982-3892-175005439618183/AnsiballZ_stat.py'
Jan 23 10:07:37 compute-1 sudo[198809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:37.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:38 compute-1 python3.9[198811]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:38 compute-1 sudo[198809]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:38 compute-1 sudo[198887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxvqzeypcotsgizrimqhmwrgjoxfzupr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162857.541982-3892-175005439618183/AnsiballZ_file.py'
Jan 23 10:07:38 compute-1 sudo[198887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:38.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:38 compute-1 python3.9[198889]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:38 compute-1 sudo[198887]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:38 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:39 compute-1 sudo[199039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-lcbpxzaelrdlirlnbwhpftbrlivlksjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162858.6528866-3928-19840934855459/AnsiballZ_stat.py'
Jan 23 10:07:39 compute-1 sudo[199039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:39 compute-1 python3.9[199041]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:39 compute-1 sudo[199039]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:39 compute-1 ceph-mon[80126]: pgmap v423: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:39 compute-1 sudo[199165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kpnzoqeztceyyahmbymppakskrfqslrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162858.6528866-3928-19840934855459/AnsiballZ_copy.py'
Jan 23 10:07:39 compute-1 sudo[199165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:39 compute-1 python3.9[199167]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769162858.6528866-3928-19840934855459/.source.nft follow=False _original_basename=ruleset.j2 checksum=ac3ce8ce2d33fa5fe0a79b0c811c97734ce43fa5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:39 compute-1 sudo[199165]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a470 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:39.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:40 compute-1 sudo[199267]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:07:40 compute-1 sudo[199267]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:40 compute-1 sudo[199267]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:40.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:40 compute-1 sudo[199342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-urorayoelchxzofzfzyfbxvyjesqxhks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162860.0112329-3973-37315893610347/AnsiballZ_file.py'
Jan 23 10:07:40 compute-1 sudo[199342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:40 compute-1 python3.9[199344]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:40 compute-1 sudo[199342]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:40 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:40 compute-1 sudo[199494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajrwuotmpxttsqddvttvpwpdtachiibz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162860.7109098-3997-197377967050450/AnsiballZ_command.py'
Jan 23 10:07:40 compute-1 sudo[199494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:41 compute-1 python3.9[199496]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:41 compute-1 sudo[199494]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:41 compute-1 ceph-mon[80126]: pgmap v424: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:41 compute-1 sudo[199650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jehbcdfmaytmobjkczriolfalgilcvbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162861.4146304-4021-166023905388243/AnsiballZ_blockinfile.py'
Jan 23 10:07:41 compute-1 sudo[199650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:41.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:42 compute-1 python3.9[199652]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                             include "/etc/nftables/edpm-chains.nft"
                                             include "/etc/nftables/edpm-rules.nft"
                                             include "/etc/nftables/edpm-jumps.nft"
                                              path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:42 compute-1 sudo[199650]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:42.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:42 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a490 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:42 compute-1 sudo[199802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zrzhpwsndcivwgroqjlpgyyyjryvsfek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162862.4657896-4048-209363923012826/AnsiballZ_command.py'
Jan 23 10:07:42 compute-1 sudo[199802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:42 compute-1 python3.9[199804]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:42 compute-1 sudo[199802]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002740 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:43 compute-1 ceph-mon[80126]: pgmap v425: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:07:43 compute-1 sudo[199956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-djzxwcuhcfdrhgqgpmhnramiajoceyok ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162863.1750693-4072-194918656329196/AnsiballZ_stat.py'
Jan 23 10:07:43 compute-1 sudo[199956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:43 compute-1 python3.9[199958]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:07:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:43.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:44 compute-1 sudo[199956]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:44.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:44 compute-1 sudo[200110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wnulgwztlwobknaczutuxgbpvmpegyzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162864.2048473-4096-4103178505217/AnsiballZ_command.py'
Jan 23 10:07:44 compute-1 sudo[200110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:44 compute-1 python3.9[200112]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:07:44 compute-1 sudo[200110]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:44 compute-1 ceph-mon[80126]: pgmap v426: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:44 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:45 compute-1 sudo[200278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ryvpypumsdrzivrhqsmflwzrpfbqapwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162864.8724692-4120-182928852852093/AnsiballZ_file.py'
Jan 23 10:07:45 compute-1 sudo[200278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:45 compute-1 podman[200239]: 2026-01-23 10:07:45.252740273 +0000 UTC m=+0.145807207 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:07:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a4b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:45 compute-1 python3.9[200286]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:45 compute-1 sudo[200278]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:45 compute-1 sudo[200444]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hvjziabzcrawdugdyhvynzdbhwltofas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162865.5692914-4144-100914400769334/AnsiballZ_stat.py'
Jan 23 10:07:45 compute-1 sudo[200444]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:07:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:45.976 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:07:46 compute-1 python3.9[200446]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:46 compute-1 sudo[200444]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:07:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:46.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:07:46 compute-1 sudo[200567]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hlkiswyniynhskspgwcvypyxknqrvuks ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162865.5692914-4144-100914400769334/AnsiballZ_copy.py'
Jan 23 10:07:46 compute-1 sudo[200567]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:46 compute-1 python3.9[200569]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162865.5692914-4144-100914400769334/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:46 compute-1 sudo[200567]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:46 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:47 compute-1 sudo[200719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-izvpywallaokukmzdfiqwpltdmuimfgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162866.7798343-4189-11565546158488/AnsiballZ_stat.py'
Jan 23 10:07:47 compute-1 sudo[200719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:47 compute-1 python3.9[200721]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:47 compute-1 sudo[200719]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:47 compute-1 ceph-mon[80126]: pgmap v427: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:47 compute-1 sudo[200843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-adrslwdsdshdymobcijwhnrgbmilljhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162866.7798343-4189-11565546158488/AnsiballZ_copy.py'
Jan 23 10:07:47 compute-1 sudo[200843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a4d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:47 compute-1 python3.9[200845]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162866.7798343-4189-11565546158488/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:47 compute-1 sudo[200843]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:47.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:48 compute-1 sudo[200995]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xcjvffvstwgyngwlqipftqsfeijkyrsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162868.0540428-4234-180452146909196/AnsiballZ_stat.py'
Jan 23 10:07:48 compute-1 sudo[200995]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:48.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:48 compute-1 python3.9[200997]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:07:48 compute-1 sudo[200995]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:48 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:48 compute-1 sudo[201118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wzlvduysnfnaiazwzjkznylnzpvxtkiy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162868.0540428-4234-180452146909196/AnsiballZ_copy.py'
Jan 23 10:07:48 compute-1 sudo[201118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:49 compute-1 python3.9[201120]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162868.0540428-4234-180452146909196/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:07:49 compute-1 sudo[201118]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:49 compute-1 ceph-mon[80126]: pgmap v428: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:49 compute-1 sudo[201271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zkwqebqrvkyuecwoijlnspuevwporhwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162869.3475292-4279-92861142682083/AnsiballZ_systemd.py'
Jan 23 10:07:49 compute-1 sudo[201271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:49 compute-1 python3.9[201273]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:07:49 compute-1 systemd[1]: Reloading.
Jan 23 10:07:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:49.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:50 compute-1 systemd-rc-local-generator[201300]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:50 compute-1 systemd-sysv-generator[201303]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:50.331 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:50 compute-1 systemd[1]: Reached target edpm_libvirt.target.
Jan 23 10:07:50 compute-1 sudo[201271]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:50 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a4f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:50 compute-1 sudo[201461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-soykxtywqpiwxhzbbbuseqnkzmxzszhy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162870.5818708-4303-107618605434382/AnsiballZ_systemd.py'
Jan 23 10:07:50 compute-1 sudo[201461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:07:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:07:51 compute-1 python3.9[201463]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Jan 23 10:07:51 compute-1 systemd[1]: Reloading.
Jan 23 10:07:51 compute-1 systemd-rc-local-generator[201495]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:51 compute-1 systemd-sysv-generator[201498]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:51 compute-1 systemd[1]: Reloading.
Jan 23 10:07:51 compute-1 systemd-rc-local-generator[201531]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:07:51 compute-1 systemd-sysv-generator[201535]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:07:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:51 compute-1 sudo[201461]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:51 compute-1 ceph-mon[80126]: pgmap v429: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:51.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:52 compute-1 sshd-session[143725]: Connection closed by 192.168.122.30 port 48434
Jan 23 10:07:52 compute-1 sshd-session[143722]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:07:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:52.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:52 compute-1 systemd[1]: session-52.scope: Deactivated successfully.
Jan 23 10:07:52 compute-1 systemd[1]: session-52.scope: Consumed 3min 39.595s CPU time.
Jan 23 10:07:52 compute-1 systemd-logind[807]: Session 52 logged out. Waiting for processes to exit.
Jan 23 10:07:52 compute-1 systemd-logind[807]: Removed session 52.
Jan 23 10:07:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:52 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:52 compute-1 ceph-mon[80126]: pgmap v430: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:07:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a1400a510 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:53 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18004410 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:53.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:07:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:54.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:07:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:54 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:07:55.034 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:07:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:07:55.034 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:07:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:07:55.035 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:07:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:55 compute-1 ceph-mon[80126]: pgmap v431: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100755 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:07:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:55 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:55.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:56 compute-1 sudo[201566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:07:56 compute-1 sudo[201566]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:56 compute-1 sudo[201566]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:56.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:56 compute-1 sudo[201591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:07:56 compute-1 sudo[201591]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:07:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:56 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:56 compute-1 sudo[201591]: pam_unix(sudo:session): session closed for user root
Jan 23 10:07:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:57 compute-1 ceph-mon[80126]: pgmap v432: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:07:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:07:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:07:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:07:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:07:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:07:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:07:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:57 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:57 compute-1 sshd-session[201647]: Accepted publickey for zuul from 192.168.122.30 port 57496 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:07:57 compute-1 systemd-logind[807]: New session 53 of user zuul.
Jan 23 10:07:57 compute-1 systemd[1]: Started Session 53 of User zuul.
Jan 23 10:07:57 compute-1 sshd-session[201647]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:07:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:07:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:57.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:07:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:07:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:07:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:07:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:07:58.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:07:58 compute-1 podman[201774]: 2026-01-23 10:07:58.66200035 +0000 UTC m=+0.061149178 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:07:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:58 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:58 compute-1 python3.9[201810]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:07:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:07:59 compute-1 ceph-mon[80126]: pgmap v433: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:07:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:07:59 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:07:59.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:00 compute-1 sudo[201975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:08:00 compute-1 sudo[201975]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:00 compute-1 sudo[201975]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:00 compute-1 python3.9[201974]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:08:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:00.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:00 compute-1 network[202016]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:08:00 compute-1 network[202017]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:08:00 compute-1 network[202018]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:08:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:00 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:01 compute-1 ceph-mon[80126]: pgmap v434: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:01 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:08:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:02.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:08:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:02.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:02 compute-1 sudo[202093]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:08:02 compute-1 sudo[202093]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:02 compute-1 sudo[202093]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:02 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:03 compute-1 ceph-mon[80126]: pgmap v435: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:08:03 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:08:03 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:08:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:03 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:04.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:04.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:04 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:04 compute-1 sudo[202315]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-svkwfwvgcitltptauwwsxgifhnrdqzsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162884.6180575-97-227388897639865/AnsiballZ_setup.py'
Jan 23 10:08:04 compute-1 sudo[202315]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:05 compute-1 ceph-mon[80126]: pgmap v436: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:08:05 compute-1 python3.9[202317]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 10:08:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:05 compute-1 sudo[202315]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:05 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:08:05 compute-1 sudo[202400]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-njllwbdagyfknfmtdxinlxmjdjsvodvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162884.6180575-97-227388897639865/AnsiballZ_dnf.py'
Jan 23 10:08:05 compute-1 sudo[202400]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:06.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:06 compute-1 python3.9[202402]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:08:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:06.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:06 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc001bd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:07 compute-1 ceph-mon[80126]: pgmap v437: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:08:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:07 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:08.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:08.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:08:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:08 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:08:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:09 compute-1 ceph-mon[80126]: pgmap v438: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:08:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:09 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:10.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:10.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:10 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:10 compute-1 ceph-mon[80126]: pgmap v439: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:08:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:11 compute-1 sudo[202400]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:11 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:12.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:12.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:12 : epoch 69734810 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:08:12 compute-1 sudo[202556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcnklldxjkuvclowuhhhrsgddetydafw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162892.0831928-133-148821948947227/AnsiballZ_stat.py'
Jan 23 10:08:12 compute-1 sudo[202556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:12 compute-1 python3.9[202558]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:12 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:12 compute-1 sudo[202556]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:13 compute-1 sudo[202709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tvzdmtbjvouauqugljswsuedrdugtsbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162892.9614139-163-120878089183210/AnsiballZ_command.py'
Jan 23 10:08:13 compute-1 sudo[202709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:13 compute-1 python3.9[202711]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:13 compute-1 sudo[202709]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:13 compute-1 ceph-mon[80126]: pgmap v440: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:08:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:13 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec002b10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:14.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:14 compute-1 sudo[202862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-saflmcmmnnknanuzdkfrbynwxlxvzfvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162893.9808843-193-156648796957188/AnsiballZ_stat.py'
Jan 23 10:08:14 compute-1 sudo[202862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:14.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:14 compute-1 python3.9[202864]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:14 compute-1 sudo[202862]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:14 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:14 compute-1 ceph-mon[80126]: pgmap v441: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:08:14 compute-1 sudo[203014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zfyfeesagbfwlxqrbwdmvbvlojbdvtyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162894.6477304-217-105303475037245/AnsiballZ_command.py'
Jan 23 10:08:14 compute-1 sudo[203014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:15 compute-1 python3.9[203016]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:15 compute-1 sudo[203014]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003c90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:15 compute-1 sudo[203181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xdxdeccjoabtgdyklokrsomllzqvvvvl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162895.3735876-241-121184879443453/AnsiballZ_stat.py'
Jan 23 10:08:15 compute-1 sudo[203181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:15 compute-1 podman[203142]: 2026-01-23 10:08:15.6966666 +0000 UTC m=+0.102312639 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:08:15 compute-1 python3.9[203189]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:15 compute-1 sudo[203181]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:15 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:16.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:16 compute-1 sudo[203318]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-woudevfvgvafjcobmtjlnyroyouiyrvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162895.3735876-241-121184879443453/AnsiballZ_copy.py'
Jan 23 10:08:16 compute-1 sudo[203318]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:16.372 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:16 compute-1 python3.9[203320]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162895.3735876-241-121184879443453/.source.iscsi _original_basename=.qq4qc0ky follow=False checksum=a41d40f9dbaa7a1982953c824d01a61d8b3c4d3a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:16 compute-1 sudo[203318]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:16 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:17 compute-1 sudo[203470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-efshgyzkandreydiwrefpqcsilteybsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162896.7657864-286-185529842520277/AnsiballZ_file.py'
Jan 23 10:08:17 compute-1 sudo[203470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:17 compute-1 python3.9[203472]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:17 compute-1 sudo[203470]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:17 compute-1 ceph-mon[80126]: pgmap v442: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:08:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100817 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:08:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:17 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:17 compute-1 sudo[203623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-yreuxxdrvlbzaxfyxwycqwtehpbenroq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162897.5525675-310-170013867863742/AnsiballZ_lineinfile.py'
Jan 23 10:08:17 compute-1 sudo[203623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:18.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:18 compute-1 python3.9[203625]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:18 compute-1 sudo[203623]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:18.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:18 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:19 compute-1 sudo[203775]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qttmuxnnfayvhnnzzzyvtpzcqytdtteq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162898.5109959-337-247254673728836/AnsiballZ_systemd_service.py'
Jan 23 10:08:19 compute-1 sudo[203775]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:19 compute-1 python3.9[203777]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:19 compute-1 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Jan 23 10:08:19 compute-1 sudo[203775]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:19 compute-1 ceph-mon[80126]: pgmap v443: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:08:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:19 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:19 compute-1 sudo[203932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sbqoytzhozuvliuqrnyblztbjeqnkgdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162899.6599195-361-230767787511502/AnsiballZ_systemd_service.py'
Jan 23 10:08:19 compute-1 sudo[203932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:20.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:20 compute-1 python3.9[203934]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:20 compute-1 systemd[1]: Reloading.
Jan 23 10:08:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:20.386 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:20 compute-1 systemd-rc-local-generator[203991]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:20 compute-1 systemd-sysv-generator[203995]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:20 compute-1 sudo[203936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:08:20 compute-1 sudo[203936]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:20 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 10:08:20 compute-1 sudo[203936]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:20 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 23 10:08:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:20 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:20 compute-1 kernel: Loading iSCSI transport class v2.0-870.
Jan 23 10:08:20 compute-1 systemd[1]: Started Open-iSCSI.
Jan 23 10:08:20 compute-1 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Jan 23 10:08:20 compute-1 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Jan 23 10:08:20 compute-1 sudo[203932]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:21 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:21 compute-1 ceph-mon[80126]: pgmap v444: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:08:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:22.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:22 compute-1 python3.9[204160]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:08:22 compute-1 network[204177]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:08:22 compute-1 network[204178]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:08:22 compute-1 network[204179]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:08:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:22.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:22 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003cf0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:23 compute-1 ceph-mon[80126]: pgmap v445: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:08:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:23 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:24.037 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:24.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:24 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89fc003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:25 compute-1 ceph-mon[80126]: pgmap v446: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:08:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f4003d10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:25 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:26.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:26.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:26 compute-1 sudo[204452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-koyyxetkjecxhbgbvzbeqiqqxirhhbmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162906.3726337-430-93267094200246/AnsiballZ_dnf.py'
Jan 23 10:08:26 compute-1 sudo[204452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:26 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:26 compute-1 python3.9[204454]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:08:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:27 compute-1 ceph-mon[80126]: pgmap v447: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:08:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:27 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:28.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:08:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:28.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:08:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:28 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:28 compute-1 podman[204461]: 2026-01-23 10:08:28.834334576 +0000 UTC m=+0.062322125 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:08:29 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 10:08:29 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 10:08:29 compute-1 systemd[1]: Reloading.
Jan 23 10:08:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:29 compute-1 systemd-sysv-generator[204525]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:29 compute-1 systemd-rc-local-generator[204522]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:29 compute-1 ceph-mon[80126]: pgmap v448: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:29 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 10:08:29 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 10:08:29 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 10:08:29 compute-1 systemd[1]: run-rf5d332351af540ce86f4f3b8d1944ec1.service: Deactivated successfully.
Jan 23 10:08:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:29 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:30.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:30 compute-1 sudo[204452]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:30.402 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:30 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a18002170 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:31 compute-1 sudo[204791]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tuvcpdmsvegdpeuptpocqlolqqktbzgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162911.1003187-457-21204122175629/AnsiballZ_file.py'
Jan 23 10:08:31 compute-1 sudo[204791]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:31 compute-1 python3.9[204794]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 10:08:31 compute-1 sudo[204791]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:31 compute-1 ceph-mon[80126]: pgmap v449: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:31 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:32.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:32 compute-1 sudo[204944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jodbewdrnqkznxlyexhswksyqfivonlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162911.7849152-481-109149322848568/AnsiballZ_modprobe.py'
Jan 23 10:08:32 compute-1 sudo[204944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:32 compute-1 python3.9[204946]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Jan 23 10:08:32 compute-1 sudo[204944]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:32.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:32 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:32 compute-1 ceph-mon[80126]: pgmap v450: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:32 compute-1 sudo[205100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ajfbxwotvcdhsnyprmabrwcaauexwsui ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162912.5913036-505-230516159021048/AnsiballZ_stat.py'
Jan 23 10:08:32 compute-1 sudo[205100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:33 compute-1 python3.9[205102]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:33 compute-1 sudo[205100]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:33 compute-1 sudo[205224]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uhbviietnjgknvnjqkbfzqtsedwkchez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162912.5913036-505-230516159021048/AnsiballZ_copy.py'
Jan 23 10:08:33 compute-1 sudo[205224]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:33 compute-1 python3.9[205226]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162912.5913036-505-230516159021048/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:33 compute-1 sudo[205224]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:33 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:34.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:34 compute-1 sudo[205376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nsedgdprccybsbuayyjarfrjydirkixc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162913.8721895-553-165327203381772/AnsiballZ_lineinfile.py'
Jan 23 10:08:34 compute-1 sudo[205376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:34 compute-1 python3.9[205378]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:34.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:34 compute-1 sudo[205376]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:34 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:35 compute-1 sudo[205528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mmposiqhwstwqmsrqpbkzfedrwdjmivy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162914.5781105-577-23217979644888/AnsiballZ_systemd.py'
Jan 23 10:08:35 compute-1 sudo[205528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180030e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:35 compute-1 python3.9[205530]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:08:35 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 10:08:35 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 23 10:08:35 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 23 10:08:35 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 23 10:08:35 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 23 10:08:35 compute-1 ceph-mon[80126]: pgmap v451: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:35 compute-1 sudo[205528]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:35 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14001320 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:36.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:36 compute-1 sudo[205685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iozxoikpxkmnhnvvmozjbroypeiorukv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162916.0432737-601-127590675522439/AnsiballZ_command.py'
Jan 23 10:08:36 compute-1 sudo[205685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:08:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:36.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:08:36 compute-1 python3.9[205687]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:36 compute-1 sudo[205685]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:36 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80038d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:37 compute-1 sudo[205838]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-elymgswgndmzgiynhrwlgfpunourgbwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162916.9387183-631-128546896510944/AnsiballZ_stat.py'
Jan 23 10:08:37 compute-1 sudo[205838]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:37 compute-1 python3.9[205840]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:37 compute-1 sudo[205838]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:37 compute-1 ceph-mon[80126]: pgmap v452: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:08:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:37 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180030e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:38 compute-1 sudo[205991]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-nssistbbgseziizispkobmbvlynivhjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162917.7334728-658-89372904624181/AnsiballZ_stat.py'
Jan 23 10:08:38 compute-1 sudo[205991]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000065s ======
Jan 23 10:08:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:38.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000065s
Jan 23 10:08:38 compute-1 python3.9[205993]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:38 compute-1 sudo[205991]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:08:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:38.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:08:38 compute-1 sudo[206114]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-acnescnywevuxmfsqamhahkkvoybqozu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162917.7334728-658-89372904624181/AnsiballZ_copy.py'
Jan 23 10:08:38 compute-1 sudo[206114]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:38 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:38 compute-1 python3.9[206116]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162917.7334728-658-89372904624181/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:38 compute-1 sudo[206114]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:38 compute-1 ceph-mon[80126]: pgmap v453: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:39 compute-1 sudo[206266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzxzkhfgsminyetyfburhgeccaxaqfqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162919.0607355-703-126466436168714/AnsiballZ_command.py'
Jan 23 10:08:39 compute-1 sudo[206266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:39 compute-1 python3.9[206268]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:39 compute-1 sudo[206266]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:39 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:40 compute-1 sudo[206420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bjlgidwwijaviiowpmlakfcvhabmqaeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162919.733067-727-86721522783368/AnsiballZ_lineinfile.py'
Jan 23 10:08:40 compute-1 sudo[206420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:40.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:40 compute-1 python3.9[206422]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:40 compute-1 sudo[206420]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:40.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:40 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180030e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:40 compute-1 sudo[206572]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xzsrcsssnruavtgdojyankhrwgyrizor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162920.422934-751-173782441296110/AnsiballZ_replace.py'
Jan 23 10:08:40 compute-1 sudo[206572]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:41 compute-1 sudo[206575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:08:41 compute-1 sudo[206575]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:08:41 compute-1 sudo[206575]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:41 compute-1 python3.9[206574]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:41 compute-1 sudo[206572]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:41 compute-1 ceph-mon[80126]: pgmap v454: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:41 compute-1 sudo[206750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tqrzyugftdvuydkuaasoncfeyuyvipoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162921.3058143-775-253984172022275/AnsiballZ_replace.py'
Jan 23 10:08:41 compute-1 sudo[206750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:41 compute-1 python3.9[206752]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:41 compute-1 sudo[206750]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:41 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:42.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:42.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:42 compute-1 sudo[206902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bwtbbzmisnoepqgfgniafydybocwgxto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162922.1462514-802-36322848426519/AnsiballZ_lineinfile.py'
Jan 23 10:08:42 compute-1 sudo[206902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:42 compute-1 python3.9[206904]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:42 compute-1 sudo[206902]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:42 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:43 compute-1 sudo[207055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckjnbfeqbuerufgtelljvdgiswhnfzng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162922.7876241-802-166528041200507/AnsiballZ_lineinfile.py'
Jan 23 10:08:43 compute-1 sudo[207055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:43 compute-1 python3.9[207057]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:43 compute-1 sudo[207055]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:43 compute-1 ceph-mon[80126]: pgmap v455: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:43 compute-1 sudo[207208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bfoexmhenfzemzpiklxucyebqjiqqsof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162923.423267-802-63980264373271/AnsiballZ_lineinfile.py'
Jan 23 10:08:43 compute-1 sudo[207208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:43 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:43 compute-1 python3.9[207210]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:43 compute-1 sudo[207208]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:44.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:44 compute-1 sudo[207360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ngwbjguucvzkgdjxralwpwmhcjloneav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162924.0606072-802-203188009704247/AnsiballZ_lineinfile.py'
Jan 23 10:08:44 compute-1 sudo[207360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:44.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:44 compute-1 python3.9[207362]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:44 compute-1 sudo[207360]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:44 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:45 compute-1 sudo[207512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-upvbggqnmkzxpektjvxpxsrujqlrfpml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162924.8026078-889-249673605756894/AnsiballZ_stat.py'
Jan 23 10:08:45 compute-1 sudo[207512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:45 compute-1 python3.9[207514]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:08:45 compute-1 sudo[207512]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:45 compute-1 sudo[207667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-parnannuocigfjdryvobvdbmgxakvyeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162925.4719899-913-174596639274115/AnsiballZ_command.py'
Jan 23 10:08:45 compute-1 sudo[207667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:45 compute-1 ceph-mon[80126]: pgmap v456: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:45 compute-1 podman[207669]: 2026-01-23 10:08:45.854567381 +0000 UTC m=+0.099261681 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller)
Jan 23 10:08:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:45 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:45 compute-1 python3.9[207670]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:08:45 compute-1 sudo[207667]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:46.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:46.426 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:46 compute-1 sudo[207846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vuuzfjrbxrnhxtdirolhjitwuehelgrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162926.2763996-940-195327417334951/AnsiballZ_systemd_service.py'
Jan 23 10:08:46 compute-1 sudo[207846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:46 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:46 compute-1 python3.9[207848]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:46 compute-1 systemd[1]: Listening on multipathd control socket.
Jan 23 10:08:47 compute-1 sudo[207846]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:47 compute-1 sudo[208003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rljntdgvbunceibotzbaolzykmwpyspb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162927.1972332-964-157772460149119/AnsiballZ_systemd_service.py'
Jan 23 10:08:47 compute-1 sudo[208003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:47 compute-1 ceph-mon[80126]: pgmap v457: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:08:47 compute-1 python3.9[208005]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:08:47 compute-1 systemd[1]: Starting Wait for udev To Complete Device Initialization...
Jan 23 10:08:47 compute-1 udevadm[208010]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Jan 23 10:08:47 compute-1 systemd[1]: Finished Wait for udev To Complete Device Initialization.
Jan 23 10:08:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:47 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:47 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 10:08:47 compute-1 multipathd[208013]: --------start up--------
Jan 23 10:08:47 compute-1 multipathd[208013]: read /etc/multipath.conf
Jan 23 10:08:47 compute-1 multipathd[208013]: path checkers start up
Jan 23 10:08:47 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 10:08:47 compute-1 sudo[208003]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:48.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:48.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:48 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:48 compute-1 ceph-mon[80126]: pgmap v458: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:48 compute-1 sudo[208170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-myqjguxhcckypkiglaufmsomomuhnhlv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162928.4931235-1000-13664856930843/AnsiballZ_file.py'
Jan 23 10:08:48 compute-1 sudo[208170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:48 compute-1 python3.9[208172]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Jan 23 10:08:49 compute-1 sudo[208170]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:49 compute-1 sudo[208323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-veiotgprwtllrjjoqtffpgmxinzzgobj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162929.1839566-1024-139852436132249/AnsiballZ_modprobe.py'
Jan 23 10:08:49 compute-1 sudo[208323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:49 compute-1 python3.9[208325]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Jan 23 10:08:49 compute-1 kernel: Key type psk registered
Jan 23 10:08:49 compute-1 sudo[208323]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:49 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:50.080 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:50 compute-1 sudo[208487]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hducwxgvurghfozidoywhtypwnhrsjto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162929.9559839-1048-163712510742347/AnsiballZ_stat.py'
Jan 23 10:08:50 compute-1 sudo[208487]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:08:50 compute-1 sshd-session[208384]: Invalid user sol from 45.148.10.240 port 41448
Jan 23 10:08:50 compute-1 python3.9[208489]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:08:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:50.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:50 compute-1 sudo[208487]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:50 compute-1 sshd-session[208384]: Connection closed by invalid user sol 45.148.10.240 port 41448 [preauth]
Jan 23 10:08:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:50 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89ec003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:50 compute-1 sudo[208610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tcxksjrxekzdbqyregdnbbbuifvpezft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162929.9559839-1048-163712510742347/AnsiballZ_copy.py'
Jan 23 10:08:50 compute-1 sudo[208610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:50 compute-1 python3.9[208612]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769162929.9559839-1048-163712510742347/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:51 compute-1 sudo[208610]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:51 compute-1 ceph-mon[80126]: pgmap v459: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a180044e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:51 compute-1 sudo[208763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qvixwehlcsslhvypjcnlwxtrgkdzmell ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162931.2605968-1096-231740502585006/AnsiballZ_lineinfile.py'
Jan 23 10:08:51 compute-1 sudo[208763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:51 compute-1 python3.9[208765]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:08:51 compute-1 sudo[208763]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:51 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f8a14002010 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:08:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:52.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:52 compute-1 sudo[208915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ilipqhxengatixioajzaudlscdhqlnxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162931.94666-1120-45746723534964/AnsiballZ_systemd.py'
Jan 23 10:08:52 compute-1 sudo[208915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:52.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:52 compute-1 python3.9[208917]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:08:52 compute-1 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jan 23 10:08:52 compute-1 systemd[1]: Stopped Load Kernel Modules.
Jan 23 10:08:52 compute-1 systemd[1]: Stopping Load Kernel Modules...
Jan 23 10:08:52 compute-1 systemd[1]: Starting Load Kernel Modules...
Jan 23 10:08:52 compute-1 systemd[1]: Finished Load Kernel Modules.
Jan 23 10:08:52 compute-1 sudo[208915]: pam_unix(sudo:session): session closed for user root
Jan 23 10:08:52 compute-1 kernel: ganesha.nfsd[195820]: segfault at 50 ip 00007f8a9d96c32e sp 00007f8a097f9210 error 4 in libntirpc.so.5.8[7f8a9d951000+2c000] likely on CPU 5 (core 0, socket 5)
Jan 23 10:08:52 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:08:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[180667]: 23/01/2026 10:08:52 : epoch 69734810 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f89f80045e0 fd 38 proxy ignored for local
Jan 23 10:08:52 compute-1 systemd[1]: Started Process Core Dump (PID 208946/UID 0).
Jan 23 10:08:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:53 compute-1 sudo[209073]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-eovyxntegrjelmjxzkqmnmrdsesqffyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162932.8816912-1144-4877172810775/AnsiballZ_dnf.py'
Jan 23 10:08:53 compute-1 sudo[209073]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:08:53 compute-1 python3.9[209075]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 10:08:53 compute-1 ceph-mon[80126]: pgmap v460: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:54.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:54 compute-1 systemd-coredump[208947]: Process 180681 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 59:
                                                    #0  0x00007f8a9d96c32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f8a9d976900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:08:54 compute-1 systemd[1]: systemd-coredump@7-208946-0.service: Deactivated successfully.
Jan 23 10:08:54 compute-1 systemd[1]: systemd-coredump@7-208946-0.service: Consumed 1.349s CPU time.
Jan 23 10:08:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:54.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:54 compute-1 podman[209082]: 2026-01-23 10:08:54.469528798 +0000 UTC m=+0.053568157 container died 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, io.buildah.version=1.40.1, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.label-schema.license=GPLv2)
Jan 23 10:08:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-c82d84bdbe6878d8b218ddda068aa529446032c3b46a70f0010b0c8d232df2bd-merged.mount: Deactivated successfully.
Jan 23 10:08:54 compute-1 ceph-mon[80126]: pgmap v461: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:08:55 compute-1 podman[209082]: 2026-01-23 10:08:55.002994911 +0000 UTC m=+0.587034250 container remove 3c55853e2d37b4c4a5abc92d7954ba8afff6304faeccb80710f09ce813e1c1da (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1)
Jan 23 10:08:55 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:08:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:08:55.035 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:08:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:08:55.036 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:08:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:08:55.036 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:08:55 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:08:55 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.015s CPU time.
Jan 23 10:08:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:56.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:08:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:56.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:08:56 compute-1 ceph-mon[80126]: pgmap v462: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:08:56 compute-1 systemd[1]: Reloading.
Jan 23 10:08:56 compute-1 systemd-rc-local-generator[209155]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:56 compute-1 systemd-sysv-generator[209161]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:57 compute-1 systemd[1]: Reloading.
Jan 23 10:08:57 compute-1 systemd-rc-local-generator[209194]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:57 compute-1 systemd-sysv-generator[209199]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:57 compute-1 systemd-logind[807]: Watching system buttons on /dev/input/event0 (Power Button)
Jan 23 10:08:57 compute-1 systemd-logind[807]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Jan 23 10:08:57 compute-1 lvm[209241]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 10:08:57 compute-1 lvm[209241]: VG ceph_vg0 finished
Jan 23 10:08:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:08:58 compute-1 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Jan 23 10:08:58 compute-1 systemd[1]: Starting man-db-cache-update.service...
Jan 23 10:08:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:08:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:08:58.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:08:58 compute-1 systemd[1]: Reloading.
Jan 23 10:08:58 compute-1 systemd-rc-local-generator[209293]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:08:58 compute-1 systemd-sysv-generator[209297]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:08:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:08:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:08:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:08:58.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:08:58 compute-1 systemd[1]: Queuing reload/restart jobs for marked units…
Jan 23 10:08:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100858 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:08:59 compute-1 podman[209303]: 2026-01-23 10:08:59.663486819 +0000 UTC m=+0.059137374 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:08:59 compute-1 ceph-mon[80126]: pgmap v463: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 10:09:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:00.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 10:09:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:00.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:00 compute-1 sudo[209073]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:01 compute-1 sudo[210562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:09:01 compute-1 sudo[210562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:01 compute-1 sudo[210562]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:01 compute-1 sudo[210637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-icaieukxnmmcjzuqkjshqjqsvgoviach ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162940.891619-1168-8275705145715/AnsiballZ_systemd_service.py'
Jan 23 10:09:01 compute-1 sudo[210637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:01 compute-1 python3.9[210639]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:09:02 compute-1 ceph-mon[80126]: pgmap v464: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:02.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:02.455 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:02 compute-1 systemd[1]: Stopping Open-iSCSI...
Jan 23 10:09:02 compute-1 iscsid[204001]: iscsid shutting down.
Jan 23 10:09:02 compute-1 systemd[1]: iscsid.service: Deactivated successfully.
Jan 23 10:09:02 compute-1 systemd[1]: Stopped Open-iSCSI.
Jan 23 10:09:02 compute-1 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Jan 23 10:09:02 compute-1 systemd[1]: Starting Open-iSCSI...
Jan 23 10:09:02 compute-1 systemd[1]: Started Open-iSCSI.
Jan 23 10:09:02 compute-1 sudo[210637]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:02 compute-1 sudo[210669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:09:02 compute-1 sudo[210669]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:02 compute-1 sudo[210669]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:02 compute-1 sudo[210718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:09:02 compute-1 sudo[210718]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:02 compute-1 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Jan 23 10:09:02 compute-1 systemd[1]: Finished man-db-cache-update.service.
Jan 23 10:09:02 compute-1 systemd[1]: man-db-cache-update.service: Consumed 1.626s CPU time.
Jan 23 10:09:02 compute-1 systemd[1]: run-rf772bd38b33d4e6f8d324c902ddfade4.service: Deactivated successfully.
Jan 23 10:09:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:03 compute-1 ceph-mon[80126]: pgmap v465: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:03 compute-1 sudo[210861]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kggcgafowkqpnqjtifejasmgvzrzjtcz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162942.7930121-1192-215222829159181/AnsiballZ_systemd_service.py'
Jan 23 10:09:03 compute-1 sudo[210861]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:03 compute-1 sudo[210718]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:03 compute-1 python3.9[210863]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:09:03 compute-1 systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Jan 23 10:09:03 compute-1 multipathd[208013]: exit (signal)
Jan 23 10:09:03 compute-1 multipathd[208013]: --------shut down-------
Jan 23 10:09:03 compute-1 systemd[1]: multipathd.service: Deactivated successfully.
Jan 23 10:09:03 compute-1 systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Jan 23 10:09:03 compute-1 systemd[1]: Starting Device-Mapper Multipath Device Controller...
Jan 23 10:09:03 compute-1 multipathd[210884]: --------start up--------
Jan 23 10:09:03 compute-1 multipathd[210884]: read /etc/multipath.conf
Jan 23 10:09:03 compute-1 multipathd[210884]: path checkers start up
Jan 23 10:09:03 compute-1 systemd[1]: Started Device-Mapper Multipath Device Controller.
Jan 23 10:09:03 compute-1 sudo[210861]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:04.105 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:04.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:05 compute-1 python3.9[211041]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 10:09:05 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 8.
Jan 23 10:09:05 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:09:05 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.015s CPU time.
Jan 23 10:09:05 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:09:05 compute-1 podman[211094]: 2026-01-23 10:09:05.602935554 +0000 UTC m=+0.054142005 container create f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325)
Jan 23 10:09:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:09:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:09:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:09:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:09:05 compute-1 podman[211094]: 2026-01-23 10:09:05.580583403 +0000 UTC m=+0.031789874 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:09:05 compute-1 podman[211094]: 2026-01-23 10:09:05.682040703 +0000 UTC m=+0.133247174 container init f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:09:05 compute-1 podman[211094]: 2026-01-23 10:09:05.688645344 +0000 UTC m=+0.139851795 container start f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, io.buildah.version=1.40.1, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:09:05 compute-1 bash[211094]: f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:09:05 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:09:05 compute-1 ceph-mon[80126]: pgmap v466: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:09:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:09:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.929152) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945929625, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 1314, "num_deletes": 260, "total_data_size": 3217789, "memory_usage": 3265072, "flush_reason": "Manual Compaction"}
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945945963, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 2098879, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 17974, "largest_seqno": 19283, "table_properties": {"data_size": 2093326, "index_size": 2947, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11386, "raw_average_key_size": 18, "raw_value_size": 2082056, "raw_average_value_size": 3402, "num_data_blocks": 132, "num_entries": 612, "num_filter_entries": 612, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162834, "oldest_key_time": 1769162834, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 16575 microseconds, and 7901 cpu microseconds.
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.946068) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 2098879 bytes OK
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.946107) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.948727) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.948764) EVENT_LOG_v1 {"time_micros": 1769162945948759, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.948786) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 3211535, prev total WAL file size 3211535, number of live WAL files 2.
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.950157) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0030' seq:72057594037927935, type:22 .. '6C6F676D00323536' seq:0, type:0; will stop at (end)
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(2049KB)], [33(11MB)]
Jan 23 10:09:05 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162945950245, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 13884958, "oldest_snapshot_seqno": -1}
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 4955 keys, 13427756 bytes, temperature: kUnknown
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946050859, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 13427756, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13392837, "index_size": 21433, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12421, "raw_key_size": 126107, "raw_average_key_size": 25, "raw_value_size": 13300862, "raw_average_value_size": 2684, "num_data_blocks": 881, "num_entries": 4955, "num_filter_entries": 4955, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769162945, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.051572) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 13427756 bytes
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.053895) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 137.7 rd, 133.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 11.2 +0.0 blob) out(12.8 +0.0 blob), read-write-amplify(13.0) write-amplify(6.4) OK, records in: 5489, records dropped: 534 output_compression: NoCompression
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.053932) EVENT_LOG_v1 {"time_micros": 1769162946053918, "job": 18, "event": "compaction_finished", "compaction_time_micros": 100857, "compaction_time_cpu_micros": 30813, "output_level": 6, "num_output_files": 1, "total_output_size": 13427756, "num_input_records": 5489, "num_output_records": 4955, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946054801, "job": 18, "event": "table_file_deletion", "file_number": 35}
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769162946058389, "job": 18, "event": "table_file_deletion", "file_number": 33}
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:05.950023) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058574) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058583) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058584) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058586) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:09:06.058587) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:09:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:06.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:09:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:06.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:09:06 compute-1 sudo[211301]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iimcsziyhgqzvknkpeiuoiiemjwmqkta ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162945.9045527-1244-170643366065214/AnsiballZ_file.py'
Jan 23 10:09:06 compute-1 sudo[211301]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:06 compute-1 python3.9[211303]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:06 compute-1 sudo[211301]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:06 compute-1 ceph-mon[80126]: pgmap v467: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 853 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:09:07 compute-1 sudo[211454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cqyfuurqyqvdkxgxeovpcavtgbuemdfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162947.6614184-1277-89737270934629/AnsiballZ_systemd_service.py'
Jan 23 10:09:07 compute-1 sudo[211454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:08.110 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:08 compute-1 python3.9[211456]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:09:08 compute-1 systemd[1]: Reloading.
Jan 23 10:09:08 compute-1 systemd-sysv-generator[211488]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:09:08 compute-1 systemd-rc-local-generator[211483]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:09:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:08.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:08 compute-1 sudo[211454]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:09 compute-1 python3.9[211641]: ansible-ansible.builtin.service_facts Invoked
Jan 23 10:09:09 compute-1 network[211658]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Jan 23 10:09:09 compute-1 network[211659]: 'network-scripts' will be removed from distribution in near future.
Jan 23 10:09:09 compute-1 network[211660]: It is advised to switch to 'NetworkManager' instead for network management.
Jan 23 10:09:09 compute-1 ceph-mon[80126]: pgmap v468: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:09:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:09:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:10.113 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:09:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:10.465 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:11 compute-1 sudo[211710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:09:11 compute-1 sudo[211710]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:11 compute-1 sudo[211710]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:11 compute-1 ceph-mon[80126]: pgmap v469: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:09:11 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:11 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:09:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:09:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:09:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:12.116 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:12.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:13 compute-1 ceph-mon[80126]: pgmap v470: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:09:13 compute-1 sudo[211959]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qspugsqkixrqxnuiqcxneowledwubtpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162953.4303384-1334-2359377630529/AnsiballZ_systemd_service.py'
Jan 23 10:09:13 compute-1 sudo[211959]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:14 compute-1 python3.9[211961]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:14 compute-1 sudo[211959]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:14.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:14.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:14 compute-1 sudo[212112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-amjmpldovweeluhbkdnvryxfoapwnbze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162954.2226233-1334-86926731284423/AnsiballZ_systemd_service.py'
Jan 23 10:09:14 compute-1 sudo[212112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:14 compute-1 python3.9[212114]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:14 compute-1 sudo[212112]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:15 compute-1 sudo[212265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-zzectsxluykuscdtgbfpiepyejieqept ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162954.9582088-1334-23410765261769/AnsiballZ_systemd_service.py'
Jan 23 10:09:15 compute-1 sudo[212265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:15 compute-1 systemd[1]: virtnodedevd.service: Deactivated successfully.
Jan 23 10:09:15 compute-1 python3.9[212267]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:15 compute-1 sudo[212265]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:15 compute-1 ceph-mon[80126]: pgmap v471: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 597 B/s wr, 1 op/s
Jan 23 10:09:15 compute-1 sudo[212433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ljoiiocbdzczoebnzkxtuprhudbcbvpj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162955.689374-1334-38403317452698/AnsiballZ_systemd_service.py'
Jan 23 10:09:15 compute-1 sudo[212433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:15 compute-1 podman[212394]: 2026-01-23 10:09:15.99444116 +0000 UTC m=+0.086678431 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:09:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:16.122 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:16 compute-1 python3.9[212441]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:16 compute-1 sudo[212433]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:16.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:16 compute-1 systemd[1]: virtproxyd.service: Deactivated successfully.
Jan 23 10:09:16 compute-1 sudo[212600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xynwnywjlezqpelixkwafqwsdmugcnmc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162956.4476092-1334-50005140686846/AnsiballZ_systemd_service.py'
Jan 23 10:09:16 compute-1 sudo[212600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:17 compute-1 ceph-mon[80126]: pgmap v472: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.3 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:09:17 compute-1 python3.9[212602]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:17 compute-1 sudo[212600]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:17 compute-1 sudo[212754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vcyjpqlupnomhrnnlztdtrsityiopfwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162957.3119602-1334-137402509441835/AnsiballZ_systemd_service.py'
Jan 23 10:09:17 compute-1 sudo[212754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:09:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c94000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:17 compute-1 python3.9[212756]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:17 compute-1 sudo[212754]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:18.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:18 compute-1 sudo[212922]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-whqmndvkijmmrclkbxsbctrsbsjlccpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162958.110299-1334-116328521213877/AnsiballZ_systemd_service.py'
Jan 23 10:09:18 compute-1 sudo[212922]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:18.479 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:18 compute-1 python3.9[212924]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:18 compute-1 sudo[212922]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:19 compute-1 sudo[213075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iilmotyqvsweaeexiyscrlmfhpdyytrb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162958.8944519-1334-200286531385303/AnsiballZ_systemd_service.py'
Jan 23 10:09:19 compute-1 sudo[213075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:19 compute-1 ceph-mon[80126]: pgmap v473: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:09:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:19 compute-1 python3.9[213077]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:09:19 compute-1 sudo[213075]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:09:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:20.127 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:09:20 compute-1 sudo[213229]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vsmfsghqvgqokuecjgmzczztwtoirhzq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162960.0048032-1511-206990671284180/AnsiballZ_file.py'
Jan 23 10:09:20 compute-1 sudo[213229]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:20 compute-1 python3.9[213231]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:20.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:20 compute-1 sudo[213229]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/100920 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:09:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:20 compute-1 sudo[213381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xmmiuxhxjckrvtbwouebwhpleulikjvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162960.6178215-1511-197790069429284/AnsiballZ_file.py'
Jan 23 10:09:20 compute-1 sudo[213381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:21 compute-1 python3.9[213383]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:21 compute-1 sudo[213381]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:21 compute-1 sudo[213384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:09:21 compute-1 sudo[213384]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:21 compute-1 sudo[213384]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:21 compute-1 sudo[213559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cvaeidchmcemxwjrfzifxljxqfcgnnlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162961.235433-1511-129565479815949/AnsiballZ_file.py'
Jan 23 10:09:21 compute-1 sudo[213559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:21 compute-1 python3.9[213561]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:21 compute-1 ceph-mon[80126]: pgmap v474: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:09:21 compute-1 sudo[213559]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:22 compute-1 sudo[213711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rukjyzxjqatgrdibkojoolohtoqrzubz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162961.8434708-1511-28569392110573/AnsiballZ_file.py'
Jan 23 10:09:22 compute-1 sudo[213711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:22.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:22 compute-1 python3.9[213713]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:22 compute-1 sudo[213711]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:22.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:22 compute-1 sudo[213863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbbbfncvismwcrfbmmulyyjayygtgcev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162962.4441974-1511-81155648949931/AnsiballZ_file.py'
Jan 23 10:09:22 compute-1 sudo[213863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:22 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:22 compute-1 ceph-mon[80126]: pgmap v475: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:09:22 compute-1 python3.9[213865]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:22 compute-1 sudo[213863]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:23 compute-1 sudo[214015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fddjcyweqpowjxczcqibrjgezeqjfytu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162963.0716026-1511-71726052472967/AnsiballZ_file.py'
Jan 23 10:09:23 compute-1 sudo[214015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:23 compute-1 python3.9[214017]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:23 compute-1 sudo[214015]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:23 compute-1 sudo[214168]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qycgoroconftrjgbaibxprdipilfpncb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162963.7182856-1511-206517885706972/AnsiballZ_file.py'
Jan 23 10:09:23 compute-1 sudo[214168]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:24.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:24 compute-1 python3.9[214170]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:24 compute-1 sudo[214168]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:24.489 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:24 compute-1 sudo[214320]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ltdmvmjwrnqxytlvlnxlqnkayrbjbzid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162964.3294697-1511-194991350141553/AnsiballZ_file.py'
Jan 23 10:09:24 compute-1 sudo[214320]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:24 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:24 compute-1 python3.9[214322]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:24 compute-1 sudo[214320]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:25 compute-1 sudo[214472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-iakziaqjkigqkvfwwbvgfpfvdhlocvlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162965.0023525-1682-60985861059685/AnsiballZ_file.py'
Jan 23 10:09:25 compute-1 sudo[214472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880025c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:25 compute-1 python3.9[214474]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:25 compute-1 sudo[214472]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:25 compute-1 ceph-mon[80126]: pgmap v476: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:09:25 compute-1 sudo[214625]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-exnztowjwdrxscsastzxdqsxzggpnttd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162965.5859776-1682-12010435792249/AnsiballZ_file.py'
Jan 23 10:09:25 compute-1 sudo[214625]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:26 compute-1 python3.9[214627]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:26 compute-1 sudo[214625]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:26.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:26 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 10:09:26 compute-1 systemd[1]: virtqemud.service: Deactivated successfully.
Jan 23 10:09:26 compute-1 sudo[214779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xfkodafdlfkdxegutebnxetirdfjmfmh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162966.2474408-1682-120963770310632/AnsiballZ_file.py'
Jan 23 10:09:26 compute-1 sudo[214779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:26.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:26 compute-1 python3.9[214781]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:26 compute-1 sudo[214779]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:26 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:27 compute-1 sudo[214931]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ubrurhxdpjyaeqlejmvsfhdkqkzwnwrp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162966.8344295-1682-229945160370449/AnsiballZ_file.py'
Jan 23 10:09:27 compute-1 sudo[214931]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:27 compute-1 python3.9[214933]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:27 compute-1 sudo[214931]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:27 compute-1 sudo[215084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mktqztbgbltofdgfczihqmwbnnynisya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162967.421824-1682-110119675679640/AnsiballZ_file.py'
Jan 23 10:09:27 compute-1 sudo[215084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:27 compute-1 ceph-mon[80126]: pgmap v477: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:09:27 compute-1 python3.9[215086]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:27 compute-1 sudo[215084]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:28.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:28 compute-1 sudo[215236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kshrvqczfceitxrwytovelanvgkzhajc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162968.040866-1682-193786952491572/AnsiballZ_file.py'
Jan 23 10:09:28 compute-1 sudo[215236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:28 compute-1 python3.9[215238]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:09:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:28.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:09:28 compute-1 sudo[215236]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:28 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:28 compute-1 sudo[215388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-hqldzpkylpmgqpbtgtrjgrddzrxwjmql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162968.655772-1682-80588921186214/AnsiballZ_file.py'
Jan 23 10:09:28 compute-1 sudo[215388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:29 compute-1 python3.9[215390]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:29 compute-1 sudo[215388]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:29 compute-1 ceph-mon[80126]: pgmap v478: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:09:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:29 compute-1 sudo[215541]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vfrrmtxdykyfbrtesxulyjxjzeuzxdkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162969.2747648-1682-66636819309300/AnsiballZ_file.py'
Jan 23 10:09:29 compute-1 sudo[215541]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:29 compute-1 python3.9[215543]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:09:29 compute-1 sudo[215541]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:09:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:30.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:09:30 compute-1 sudo[215699]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-qcnerbrpgfodaogaikbvoodggimysury ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162970.2154722-1856-124222139969896/AnsiballZ_command.py'
Jan 23 10:09:30 compute-1 sudo[215699]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:30.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:30 compute-1 podman[215667]: 2026-01-23 10:09:30.513759434 +0000 UTC m=+0.063630556 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 10:09:30 compute-1 python3.9[215708]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                               systemctl disable --now certmonger.service
                                               test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                             fi
                                              _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:30 compute-1 sudo[215699]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:30 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880032d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:31 compute-1 ceph-mon[80126]: pgmap v479: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:09:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:31 compute-1 python3.9[215867]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Jan 23 10:09:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:32 compute-1 sudo[216018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vlxnhmbzqlckxjzskyudybcttrsuvipr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162971.822973-1910-174092447594240/AnsiballZ_systemd_service.py'
Jan 23 10:09:32 compute-1 sudo[216018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:32.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:32 compute-1 python3.9[216020]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:09:32 compute-1 systemd[1]: Reloading.
Jan 23 10:09:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:32.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:32 compute-1 systemd-rc-local-generator[216046]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:09:32 compute-1 systemd-sysv-generator[216051]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:09:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:32 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:32 compute-1 sudo[216018]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:33 compute-1 sudo[216205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ndaqhpzwkurpxhydpiteholeuqltuxgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162972.9970536-1934-60630445021752/AnsiballZ_command.py'
Jan 23 10:09:33 compute-1 sudo[216205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:33 compute-1 ceph-mon[80126]: pgmap v480: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:09:33 compute-1 python3.9[216207]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:33 compute-1 sudo[216205]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:33 compute-1 sudo[216359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xlnvwidfqucqcuvwlrcyoijimlpseiju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162973.6334295-1934-111700955451514/AnsiballZ_command.py'
Jan 23 10:09:33 compute-1 sudo[216359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:34 compute-1 python3.9[216361]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:34 compute-1 sudo[216359]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:34.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:34 compute-1 sudo[216512]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-clhqtgadaijkbijawlnrfnnekckxptyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162974.239775-1934-87831262078531/AnsiballZ_command.py'
Jan 23 10:09:34 compute-1 sudo[216512]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:34.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:34 compute-1 python3.9[216514]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:34 compute-1 sudo[216512]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:34 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:35 compute-1 sudo[216665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vgezuzasgbjvcdzxijedysohibfvnhej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162974.8226616-1934-183879166454011/AnsiballZ_command.py'
Jan 23 10:09:35 compute-1 sudo[216665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:35 compute-1 python3.9[216667]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:35 compute-1 sudo[216665]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:35 compute-1 ceph-mon[80126]: pgmap v481: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:35 compute-1 sudo[216819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wrvwkwocteimwgmclpjloqfcxsutapzi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162975.452614-1934-280798751232914/AnsiballZ_command.py'
Jan 23 10:09:35 compute-1 sudo[216819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:35 compute-1 python3.9[216821]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:35 compute-1 sudo[216819]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:36.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:36 compute-1 sudo[216972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-tjtxyuqncxquszdpayydzbcbueecgiyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162976.064291-1934-194032178944638/AnsiballZ_command.py'
Jan 23 10:09:36 compute-1 sudo[216972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:36 compute-1 python3.9[216974]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:36.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:36 compute-1 sudo[216972]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:36 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:36 compute-1 sudo[217125]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-jugkombrfycqxmzzligpedcfdoenditn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162976.6649945-1934-114420543617285/AnsiballZ_command.py'
Jan 23 10:09:36 compute-1 sudo[217125]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:37 compute-1 python3.9[217127]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:37 compute-1 sudo[217125]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:37 compute-1 sudo[217279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wicbyipuvwqnynpjzrcmxanksdmvbfah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162977.281361-1934-38172515757865/AnsiballZ_command.py'
Jan 23 10:09:37 compute-1 sudo[217279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:37 compute-1 ceph-mon[80126]: pgmap v482: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:09:37 compute-1 python3.9[217281]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Jan 23 10:09:37 compute-1 sudo[217279]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:38.155 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:38.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:38 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:38 compute-1 ceph-mon[80126]: pgmap v483: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:40.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:40.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:40 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:41 compute-1 sudo[217358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:09:41 compute-1 sudo[217358]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:09:41 compute-1 sudo[217358]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:41 compute-1 sudo[217459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ckkvfmoklzydaqfcdknlcyskpgndcztu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162981.1448524-2141-144705615157705/AnsiballZ_file.py'
Jan 23 10:09:41 compute-1 sudo[217459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:41 compute-1 ceph-mon[80126]: pgmap v484: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:41 compute-1 python3.9[217461]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:41 compute-1 sudo[217459]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:42.163 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:42 compute-1 sudo[217611]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-szcfayrjfdlalhqhrtzqktjhhkvivjzb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162981.7755551-2141-158216925704404/AnsiballZ_file.py'
Jan 23 10:09:42 compute-1 sudo[217611]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:42 compute-1 python3.9[217613]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:42.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:42 compute-1 sudo[217611]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:42 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:42 compute-1 sudo[217763]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ufgiaoguaceewlnooqbqyaamnspoxfvr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162982.6712914-2141-164755096876907/AnsiballZ_file.py'
Jan 23 10:09:42 compute-1 sudo[217763]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:43 compute-1 python3.9[217765]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:43 compute-1 sudo[217763]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:43 compute-1 sudo[217916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ohdhgohjwlslifqwteipfcminvfyftki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162983.384434-2207-134639574597111/AnsiballZ_file.py'
Jan 23 10:09:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:43 compute-1 sudo[217916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:44 compute-1 ceph-mon[80126]: pgmap v485: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:09:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:44.165 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:44 compute-1 python3.9[217918]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:44 compute-1 sudo[217916]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:44.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:44 compute-1 sudo[218068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-dbncgbswzrmwhyxnmgzucorbggexphbv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162984.436266-2207-236188637031841/AnsiballZ_file.py'
Jan 23 10:09:44 compute-1 sudo[218068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:44 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:44 compute-1 python3.9[218070]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:44 compute-1 sudo[218068]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:45 compute-1 ceph-mon[80126]: pgmap v486: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:45 compute-1 sudo[218221]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-gxldopjmksnjpoacbgfjtnbpkhuobwpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162985.0210676-2207-280967167512338/AnsiballZ_file.py'
Jan 23 10:09:45 compute-1 sudo[218221]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:45 compute-1 python3.9[218223]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:45 compute-1 sudo[218221]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:46.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:46 compute-1 sudo[218384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-enrwmtyjpigvpscsfrccmovbmbqnulgp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162985.9028232-2207-234887546804064/AnsiballZ_file.py'
Jan 23 10:09:46 compute-1 sudo[218384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:46 compute-1 podman[218347]: 2026-01-23 10:09:46.281719143 +0000 UTC m=+0.104603962 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, 
container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:09:46 compute-1 python3.9[218390]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:46 compute-1 sudo[218384]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:46.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:46 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:46 compute-1 sudo[218549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypygibepyvxladsswmwqchcyxcqnxukg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162986.5809839-2207-210810091783205/AnsiballZ_file.py'
Jan 23 10:09:46 compute-1 sudo[218549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:47 compute-1 python3.9[218551]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:47 compute-1 sudo[218549]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:47 compute-1 sudo[218702]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-grjiawysoudgnfuwavcrkvozaofcjzyi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162987.2140427-2207-140129410886428/AnsiballZ_file.py'
Jan 23 10:09:47 compute-1 sudo[218702]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:47 compute-1 python3.9[218704]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:47 compute-1 sudo[218702]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:47 compute-1 ceph-mon[80126]: pgmap v487: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:09:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:48 compute-1 sudo[218854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vffzhmgultlhargolnbwlhrtfadrutnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162987.8093796-2207-251033395455880/AnsiballZ_file.py'
Jan 23 10:09:48 compute-1 sudo[218854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:48.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:48 compute-1 python3.9[218856]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:09:48 compute-1 sudo[218854]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:09:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:48.525 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:09:48 compute-1 ceph-mon[80126]: pgmap v488: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:48 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88003fe0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:09:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:50.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:50.528 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:50 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:51 compute-1 ceph-mon[80126]: pgmap v489: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:09:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Cumulative writes: 9060 writes, 35K keys, 9060 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 9060 writes, 1959 syncs, 4.62 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 781 writes, 1248 keys, 781 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 781 writes, 366 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:09:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68000d90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:52.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:52.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:52 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c000d00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:53 compute-1 ceph-mon[80126]: pgmap v490: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:09:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:54.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:54.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:54 compute-1 sudo[219011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ifavdvypeaittbhkthoikdoyvpoodjop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162994.002806-2532-246964673770482/AnsiballZ_getent.py'
Jan 23 10:09:54 compute-1 sudo[219011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:54 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:54 compute-1 python3.9[219013]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Jan 23 10:09:54 compute-1 sudo[219011]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:09:55.036 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:09:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:09:55.037 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:09:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:09:55.037 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:09:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:55 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:55 compute-1 ceph-mon[80126]: pgmap v491: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:55 compute-1 sudo[219165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ypebjjkhyblbwryvtyabrjxjubtefffm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162995.1346312-2556-28486342734564/AnsiballZ_group.py'
Jan 23 10:09:55 compute-1 sudo[219165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:55 compute-1 python3.9[219167]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Jan 23 10:09:55 compute-1 groupadd[219168]: group added to /etc/group: name=nova, GID=42436
Jan 23 10:09:55 compute-1 groupadd[219168]: group added to /etc/gshadow: name=nova
Jan 23 10:09:55 compute-1 groupadd[219168]: new group: name=nova, GID=42436
Jan 23 10:09:55 compute-1 sudo[219165]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:55 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:56.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:09:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:56.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:09:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:56 compute-1 ceph-mon[80126]: pgmap v492: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:09:56 compute-1 sudo[219323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rsxdgbtprqzgkwxhamfhhjylvbmitqhv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769162996.498372-2580-2531702368286/AnsiballZ_user.py'
Jan 23 10:09:56 compute-1 sudo[219323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:09:57 compute-1 python3.9[219325]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-1 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Jan 23 10:09:57 compute-1 useradd[219327]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Jan 23 10:09:57 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:09:57 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:09:57 compute-1 useradd[219327]: add 'nova' to group 'libvirt'
Jan 23 10:09:57 compute-1 useradd[219327]: add 'nova' to shadow group 'libvirt'
Jan 23 10:09:57 compute-1 sudo[219323]: pam_unix(sudo:session): session closed for user root
Jan 23 10:09:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:09:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:09:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:09:58.191 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:09:58 compute-1 sshd-session[219360]: Accepted publickey for zuul from 192.168.122.30 port 50834 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:09:58 compute-1 systemd-logind[807]: New session 54 of user zuul.
Jan 23 10:09:58 compute-1 systemd[1]: Started Session 54 of User zuul.
Jan 23 10:09:58 compute-1 sshd-session[219360]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:09:58 compute-1 sshd-session[219363]: Received disconnect from 192.168.122.30 port 50834:11: disconnected by user
Jan 23 10:09:58 compute-1 sshd-session[219363]: Disconnected from user zuul 192.168.122.30 port 50834
Jan 23 10:09:58 compute-1 sshd-session[219360]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:09:58 compute-1 systemd[1]: session-54.scope: Deactivated successfully.
Jan 23 10:09:58 compute-1 systemd-logind[807]: Session 54 logged out. Waiting for processes to exit.
Jan 23 10:09:58 compute-1 systemd-logind[807]: Removed session 54.
Jan 23 10:09:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:09:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:09:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:09:58.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:09:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:58 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70002f50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:59 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:09:59 compute-1 python3.9[219513]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:09:59 compute-1 ceph-mon[80126]: pgmap v493: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:09:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:09:59 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680018b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:00 compute-1 python3.9[219635]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769162998.9841406-2655-224926547727298/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:00.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:00 compute-1 ceph-mon[80126]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:10:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:00.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:00 compute-1 podman[219759]: 2026-01-23 10:10:00.669451658 +0000 UTC m=+0.057738510 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:10:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:00 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c001820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:00 compute-1 python3.9[219796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:01 compute-1 python3.9[219878]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:01 compute-1 sudo[219904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:10:01 compute-1 sudo[219904]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:01 compute-1 sudo[219904]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:01 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:01 compute-1 ceph-mon[80126]: pgmap v494: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:01 compute-1 python3.9[220054]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:01 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:02.235 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:02 compute-1 python3.9[220175]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163001.4491875-2655-276195173207180/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:02.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:02 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:03 compute-1 python3.9[220325]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:03 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:03 compute-1 ceph-mon[80126]: pgmap v495: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:03 compute-1 python3.9[220447]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163002.58453-2655-13136816431452/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=bc7f3bb7d4094c596a18178a888511b54e157ba4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:03 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:10:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:04.239 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:10:04 compute-1 python3.9[220597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:04.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:04 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:05 compute-1 python3.9[220718]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163004.04586-2655-226613656211426/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:05 compute-1 ceph-mon[80126]: pgmap v496: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:05 compute-1 python3.9[220869]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:06.242 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:06 compute-1 python3.9[220990]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163005.4865828-2655-171246921296239/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:06.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:06 compute-1 ceph-mon[80126]: pgmap v497: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:10:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:06 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:07 compute-1 sudo[221140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-isgkzowmardmrwworvussqobvwxxumtd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163006.812327-2904-235426420810724/AnsiballZ_file.py'
Jan 23 10:10:07 compute-1 sudo[221140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:07 compute-1 python3.9[221142]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:07 compute-1 sudo[221140]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:07 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:07 compute-1 sudo[221293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-cejkkmumtfcbfuiezuivsqyaqooowpup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163007.4795773-2928-141521627598597/AnsiballZ_copy.py'
Jan 23 10:10:07 compute-1 sudo[221293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:07 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68002d40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:07 compute-1 python3.9[221295]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:07 compute-1 sudo[221293]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:08.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:08.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:08 compute-1 sudo[221445]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-xosvtcrdumphhceofslhpqjamsbzikmn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163008.340113-2952-27630700968242/AnsiballZ_stat.py'
Jan 23 10:10:08 compute-1 sudo[221445]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:08 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002cb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:08 compute-1 python3.9[221447]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:08 compute-1 sudo[221445]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:09 compute-1 sudo[221597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-uydkxsqjwycriewkfumwlsbcgxusrosn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163009.0390093-2977-42477984145074/AnsiballZ_stat.py'
Jan 23 10:10:09 compute-1 sudo[221597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:09 compute-1 ceph-mon[80126]: pgmap v498: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:09 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:09 compute-1 python3.9[221599]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:09 compute-1 sudo[221597]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:09 compute-1 sudo[221721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-sacmkkmvofxoczoxpigaovniwerhodqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163009.0390093-2977-42477984145074/AnsiballZ_copy.py'
Jan 23 10:10:09 compute-1 sudo[221721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:09 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:10 compute-1 python3.9[221723]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1769163009.0390093-2977-42477984145074/.source _original_basename=.1srvojmh follow=False checksum=43aa8ea3ed4ec99d1d20bccd165c6d046c0b601f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Jan 23 10:10:10 compute-1 sudo[221721]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:10.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:10.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:10 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:10 compute-1 python3.9[221875]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:11 compute-1 sudo[221982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:10:11 compute-1 sudo[221982]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:11 compute-1 sudo[221982]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:11 compute-1 sudo[222043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:10:11 compute-1 sudo[222043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:11 compute-1 ceph-mon[80126]: pgmap v499: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:11 compute-1 python3.9[222061]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:11 compute-1 sudo[222043]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:11 compute-1 sudo[222170]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:10:11 compute-1 sudo[222170]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:11 compute-1 sudo[222170]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:11 compute-1 sudo[222218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:10:11 compute-1 sudo[222218]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:12 compute-1 python3.9[222270]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163011.1739957-3054-64684880345036/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=53b8456782b81b5794d3eef3fadcfb00db1088a8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:12.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:12 compute-1 sudo[222218]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:12.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:12 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-1 ceph-mon[80126]: pgmap v500: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:10:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:10:13 compute-1 python3.9[222451]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Jan 23 10:10:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:13 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:13 compute-1 python3.9[222572]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769163012.319673-3099-110490860432733/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=0333d3a3f5c3a0526b0ebe430250032166710e8a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Jan 23 10:10:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:13 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:14.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:14 compute-1 sudo[222723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ldxsjxjfcgivtpexgwkkxgmlblkaojxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163014.010203-3150-250800881669351/AnsiballZ_container_config_data.py'
Jan 23 10:10:14 compute-1 sudo[222723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:14.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:14 compute-1 python3.9[222725]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Jan 23 10:10:14 compute-1 sudo[222723]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:14 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:15 compute-1 sudo[222876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-bckwdyvurrtrepunlafvlnbtykhqlull ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163015.0629876-3183-35910230017990/AnsiballZ_container_config_hash.py'
Jan 23 10:10:15 compute-1 sudo[222876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:15 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:15 compute-1 ceph-mon[80126]: pgmap v501: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:15 compute-1 python3.9[222878]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:10:15 compute-1 sudo[222876]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:15 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:16.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:16.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:16 compute-1 sudo[223048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-vxmhxchbafvishgkahectewyvuicppri ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769163016.0626085-3213-244814404313137/AnsiballZ_edpm_container_manage.py'
Jan 23 10:10:16 compute-1 sudo[223048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:16 compute-1 podman[222997]: 2026-01-23 10:10:16.715716777 +0000 UTC m=+0.113123783 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:10:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:16 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:16 compute-1 ceph-mon[80126]: pgmap v502: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:10:17 compute-1 python3[223054]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:10:17 compute-1 sudo[223084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:10:17 compute-1 sudo[223084]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:17 compute-1 sudo[223084]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:18 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:18 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:10:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:18.259 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:18.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:19 compute-1 ceph-mon[80126]: pgmap v503: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003e40 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:20.262 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:20.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:21 compute-1 ceph-mon[80126]: pgmap v504: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:21 compute-1 sudo[223152]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:10:21 compute-1 sudo[223152]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:21 compute-1 sudo[223152]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:22.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:22.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:22 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:24.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:24.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:24 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:26.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:26.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:26 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:28 compute-1 podman[223071]: 2026-01-23 10:10:28.120728152 +0000 UTC m=+11.044589546 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 10:10:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:28.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:28 compute-1 podman[223218]: 2026-01-23 10:10:28.253014872 +0000 UTC m=+0.024693825 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 10:10:28 compute-1 podman[223218]: 2026-01-23 10:10:28.395198245 +0000 UTC m=+0.166877108 container create cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.license=GPLv2, config_id=edpm, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 10:10:28 compute-1 python3[223054]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Jan 23 10:10:28 compute-1 sudo[223048]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:28.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:28 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:29 compute-1 ceph-mon[80126]: pgmap v505: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:30 compute-1 sudo[223408]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rvpghzoaeidsezdzojfgflltsorofajp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163029.967063-3237-278634218641596/AnsiballZ_stat.py'
Jan 23 10:10:30 compute-1 sudo[223408]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:30.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:30 compute-1 ceph-mon[80126]: pgmap v506: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:30 compute-1 ceph-mon[80126]: pgmap v507: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:10:30 compute-1 ceph-mon[80126]: pgmap v508: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:30 compute-1 python3.9[223410]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:30 compute-1 sudo[223408]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:30.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:30 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:31 compute-1 sudo[223575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ayllwdshzyzrexuqlpuvlwoywjyhnmjh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163031.1828482-3273-71048336781898/AnsiballZ_container_config_data.py'
Jan 23 10:10:31 compute-1 sudo[223575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:31 compute-1 podman[223536]: 2026-01-23 10:10:31.465560369 +0000 UTC m=+0.058228109 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:10:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:31 compute-1 ceph-mon[80126]: pgmap v509: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:31 compute-1 python3.9[223582]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Jan 23 10:10:31 compute-1 sudo[223575]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:32.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:32 compute-1 sudo[223733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-pebpqmutznbheemfrztnedlzugeeyxwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163032.0535672-3306-165115715343317/AnsiballZ_container_config_hash.py'
Jan 23 10:10:32 compute-1 sudo[223733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:32 compute-1 python3.9[223735]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Jan 23 10:10:32 compute-1 sudo[223733]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:32.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:32 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88001ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:33 compute-1 sudo[223885]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-ivrnupskrxgwaqedzfoozsppfznowqip ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769163032.8884466-3336-47813314180766/AnsiballZ_edpm_container_manage.py'
Jan 23 10:10:33 compute-1 sudo[223885]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:33 compute-1 python3[223887]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Jan 23 10:10:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:33 compute-1 podman[223923]: 2026-01-23 10:10:33.634669604 +0000 UTC m=+0.051720912 container create d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible)
Jan 23 10:10:33 compute-1 podman[223923]: 2026-01-23 10:10:33.605599342 +0000 UTC m=+0.022650670 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b
Jan 23 10:10:33 compute-1 python3[223887]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume 
/etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b kolla_start
Jan 23 10:10:33 compute-1 sudo[223885]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:34.285 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:34.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:34 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:36.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:36.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:36 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:38.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:38.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:38 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:40.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:40 compute-1 ceph-mon[80126]: pgmap v510: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:40.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:40 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:40 compute-1 sudo[224112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-rcyzvmwofvbxayqkqxzvrldkryipwmyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163040.5944202-3360-149265892668874/AnsiballZ_stat.py'
Jan 23 10:10:40 compute-1 sudo[224112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:41 compute-1 python3.9[224114]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:41 compute-1 sudo[224112]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:41 compute-1 sudo[224221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:10:41 compute-1 sudo[224221]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:10:41 compute-1 sudo[224221]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:41 compute-1 sudo[224292]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mhxpcpjvcfyizyyvtjnstrzakarvxdsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.3977783-3387-224411499426321/AnsiballZ_file.py'
Jan 23 10:10:41 compute-1 sudo[224292]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:41 compute-1 python3.9[224294]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:41 compute-1 sudo[224292]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:42 compute-1 ceph-mon[80126]: pgmap v511: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:42 compute-1 ceph-mon[80126]: pgmap v512: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:10:42 compute-1 ceph-mon[80126]: pgmap v513: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:42 compute-1 ceph-mon[80126]: pgmap v514: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:42 compute-1 sudo[224443]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-mcitazkkpgnygygwtcfztltjjepzegrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.9171777-3387-71067633594809/AnsiballZ_copy.py'
Jan 23 10:10:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:42.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:42 compute-1 sudo[224443]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:42 compute-1 python3.9[224445]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769163041.9171777-3387-71067633594809/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 10:10:42 compute-1 sudo[224443]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:42.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:42 compute-1 sudo[224519]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-wbbyyarhkxlbylgtzocnmbvwinzehthr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.9171777-3387-71067633594809/AnsiballZ_systemd.py'
Jan 23 10:10:42 compute-1 sudo[224519]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:42 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:43 compute-1 python3.9[224521]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Jan 23 10:10:43 compute-1 systemd[1]: Reloading.
Jan 23 10:10:43 compute-1 systemd-sysv-generator[224550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:10:43 compute-1 systemd-rc-local-generator[224539]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:10:43 compute-1 sudo[224519]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:43 compute-1 sudo[224630]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-fpibdwslrteeaqlcyaiktybhxufvzgah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163041.9171777-3387-71067633594809/AnsiballZ_systemd.py'
Jan 23 10:10:43 compute-1 sudo[224630]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:44 compute-1 python3.9[224632]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Jan 23 10:10:44 compute-1 systemd[1]: Reloading.
Jan 23 10:10:44 compute-1 systemd-rc-local-generator[224661]: /etc/rc.d/rc.local is not marked executable, skipping.
Jan 23 10:10:44 compute-1 systemd-sysv-generator[224665]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Jan 23 10:10:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:44.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:44 compute-1 systemd[1]: Starting nova_compute container...
Jan 23 10:10:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:44.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:44 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:10:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:44 compute-1 podman[224672]: 2026-01-23 10:10:44.716180301 +0000 UTC m=+0.099495599 container init d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:10:44 compute-1 podman[224672]: 2026-01-23 10:10:44.721598513 +0000 UTC m=+0.104913781 container start d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, tcib_managed=true)
Jan 23 10:10:44 compute-1 podman[224672]: nova_compute
Jan 23 10:10:44 compute-1 nova_compute[224687]: + sudo -E kolla_set_configs
Jan 23 10:10:44 compute-1 systemd[1]: Started nova_compute container.
Jan 23 10:10:44 compute-1 sudo[224630]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:44 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Validating config file
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying service configuration files
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Deleting /etc/ceph
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Creating directory /etc/ceph
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Writing out command to execute
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:44 compute-1 nova_compute[224687]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:44 compute-1 nova_compute[224687]: ++ cat /run_command
Jan 23 10:10:44 compute-1 nova_compute[224687]: + CMD=nova-compute
Jan 23 10:10:44 compute-1 nova_compute[224687]: + ARGS=
Jan 23 10:10:44 compute-1 nova_compute[224687]: + sudo kolla_copy_cacerts
Jan 23 10:10:44 compute-1 nova_compute[224687]: + [[ ! -n '' ]]
Jan 23 10:10:44 compute-1 nova_compute[224687]: + . kolla_extend_start
Jan 23 10:10:44 compute-1 nova_compute[224687]: Running command: 'nova-compute'
Jan 23 10:10:44 compute-1 nova_compute[224687]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 10:10:44 compute-1 nova_compute[224687]: + umask 0022
Jan 23 10:10:44 compute-1 nova_compute[224687]: + exec nova-compute
Jan 23 10:10:45 compute-1 ceph-mon[80126]: pgmap v515: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101045 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:10:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:46 compute-1 ceph-mon[80126]: pgmap v516: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:46.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:46.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:46 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.084 224691 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.239 224691 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.275 224691 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.276 224691 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 10:10:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:10:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Cumulative writes: 3512 writes, 20K keys, 3512 commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.04 MB/s
                                           Cumulative WAL: 3512 writes, 3512 syncs, 1.00 writes per sync, written: 0.05 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1356 writes, 6394 keys, 1356 commit groups, 1.0 writes per commit group, ingest: 16.20 MB, 0.03 MB/s
                                           Interval WAL: 1356 writes, 1356 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     47.1      0.59              0.08         9    0.065       0      0       0.0       0.0
                                             L6      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   3.5    135.0    117.2      0.84              0.29         8    0.105     39K   4175       0.0       0.0
                                            Sum      1/0   12.81 MB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   4.5     79.2     88.2      1.43              0.37        17    0.084     39K   4175       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   6.1    127.6    128.2      0.34              0.13         6    0.057     16K   1877       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   0.0    135.0    117.2      0.84              0.29         8    0.105     39K   4175       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     47.2      0.59              0.08         8    0.073       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1200.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.027, interval 0.007
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.12 GB write, 0.10 MB/s write, 0.11 GB read, 0.09 MB/s read, 1.4 seconds
                                           Interval compaction: 0.04 GB write, 0.07 MB/s write, 0.04 GB read, 0.07 MB/s read, 0.3 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 4.88 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000142 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(262,4.55 MB,1.4973%) FilterBlock(17,118.48 KB,0.0380616%) IndexBlock(17,221.48 KB,0.0711491%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:10:47 compute-1 ceph-mon[80126]: pgmap v517: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:47 compute-1 podman[224829]: 2026-01-23 10:10:47.60490502 +0000 UTC m=+0.092729145 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:10:47 compute-1 python3.9[224866]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.802 224691 INFO nova.virt.driver [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 23 10:10:47 compute-1 nova_compute[224687]: 2026-01-23 10:10:47.933 224691 INFO nova.compute.provider_config [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 23 10:10:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.064 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.065 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.066 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.067 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.068 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.069 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.070 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.071 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.072 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.073 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.074 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.075 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.076 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.077 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.078 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.079 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.080 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.081 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.082 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.083 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.084 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.085 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.086 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.087 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.088 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.089 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.090 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.091 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.092 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.093 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.094 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.095 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.096 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.097 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.098 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.099 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.100 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.101 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.102 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.103 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.104 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.105 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.106 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.107 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.108 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.109 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.110 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.111 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.112 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.113 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.114 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.115 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.116 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.117 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.118 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.119 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.120 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.121 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.122 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.123 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.124 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.125 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.126 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.127 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.128 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.129 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.130 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.131 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.132 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.133 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.134 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.135 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.136 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.137 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.138 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.139 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.140 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.141 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.142 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 WARNING oslo_config.cfg [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 10:10:48 compute-1 nova_compute[224687]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 10:10:48 compute-1 nova_compute[224687]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 10:10:48 compute-1 nova_compute[224687]: and ``live_migration_inbound_addr`` respectively.
Jan 23 10:10:48 compute-1 nova_compute[224687]: ).  Its value may be silently ignored in the future.
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.143 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.144 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.145 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.146 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.147 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.148 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.149 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.150 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.151 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.152 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.153 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.154 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.155 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.156 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.157 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.158 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.159 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.160 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.161 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.162 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.163 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.164 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.165 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.166 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.167 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.168 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.169 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.170 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.171 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.172 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.173 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.174 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.175 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.176 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.177 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.178 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.179 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.180 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.181 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.182 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.183 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.184 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.185 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.186 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.187 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.188 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.189 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.190 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.191 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.192 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.193 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.194 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.195 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.196 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.197 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.198 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.199 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.200 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.201 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.202 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.203 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.203 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.203 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.204 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.205 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.206 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.207 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.208 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.209 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.210 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.211 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.212 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.213 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.214 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.215 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.216 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.217 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.218 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.219 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.220 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.221 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.222 224691 DEBUG oslo_service.service [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.224 224691 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.292 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.292 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.293 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.293 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 23 10:10:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:48.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:48 compute-1 systemd[1]: Starting libvirt QEMU daemon...
Jan 23 10:10:48 compute-1 systemd[1]: Started libvirt QEMU daemon.
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.375 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f0b935f8880> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.380 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f0b935f8880> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.382 224691 INFO nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Connection event '1' reason 'None'
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.512 224691 WARNING nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Cannot update service status on host "compute-1.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 23 10:10:48 compute-1 nova_compute[224687]: 2026-01-23 10:10:48.512 224691 DEBUG nova.virt.libvirt.volume.mount [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 10:10:48 compute-1 python3.9[225074]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:10:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:48.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:10:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:48 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.243 224691 INFO nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]: 
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <host>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <uuid>53821a39-1f4a-4bf2-b036-ba3044ea8780</uuid>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <arch>x86_64</arch>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model>EPYC-Rome-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <vendor>AMD</vendor>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <microcode version='16777317'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <signature family='23' model='49' stepping='0'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='x2apic'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='tsc-deadline'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='osxsave'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='hypervisor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='tsc_adjust'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='spec-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='stibp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='arch-capabilities'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='cmp_legacy'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='topoext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='virt-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='lbrv'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='tsc-scale'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='vmcb-clean'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='pause-filter'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='pfthreshold'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='svme-addr-chk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='rdctl-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='skip-l1dfl-vmentry'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='mds-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature name='pschange-mc-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <pages unit='KiB' size='4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <pages unit='KiB' size='2048'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <pages unit='KiB' size='1048576'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <power_management>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <suspend_mem/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </power_management>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <iommu support='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <migration_features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <live/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <uri_transports>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <uri_transport>tcp</uri_transport>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <uri_transport>rdma</uri_transport>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </uri_transports>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </migration_features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <topology>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <cells num='1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <cell id='0'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           <memory unit='KiB'>7864316</memory>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           <pages unit='KiB' size='2048'>0</pages>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           <distances>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <sibling id='0' value='10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           </distances>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           <cpus num='8'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:           </cpus>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         </cell>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </cells>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </topology>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <cache>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </cache>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <secmodel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model>selinux</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <doi>0</doi>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </secmodel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <secmodel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model>dac</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <doi>0</doi>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </secmodel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </host>
Jan 23 10:10:49 compute-1 nova_compute[224687]: 
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <guest>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <os_type>hvm</os_type>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <arch name='i686'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <wordsize>32</wordsize>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <domain type='qemu'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <domain type='kvm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </arch>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <pae/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <nonpae/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <apic default='on' toggle='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <cpuselection/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <deviceboot/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <externalSnapshot/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </guest>
Jan 23 10:10:49 compute-1 nova_compute[224687]: 
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <guest>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <os_type>hvm</os_type>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <arch name='x86_64'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <wordsize>64</wordsize>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <domain type='qemu'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <domain type='kvm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </arch>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <apic default='on' toggle='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <cpuselection/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <deviceboot/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <externalSnapshot/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </guest>
Jan 23 10:10:49 compute-1 nova_compute[224687]: 
Jan 23 10:10:49 compute-1 nova_compute[224687]: </capabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]: 
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.249 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.270 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 10:10:49 compute-1 nova_compute[224687]: <domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <domain>kvm</domain>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <arch>i686</arch>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <vcpu max='4096'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <iothreads supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <os supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='firmware'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <loader supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>rom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pflash</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='readonly'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>yes</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='secure'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </loader>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </os>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='maximumMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <vendor>AMD</vendor>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='succor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='custom' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <memoryBacking supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='sourceType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>anonymous</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>memfd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </memoryBacking>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <disk supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='diskDevice'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>disk</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cdrom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>floppy</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>lun</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>fdc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>sata</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </disk>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <graphics supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vnc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egl-headless</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </graphics>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <video supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='modelType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vga</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cirrus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>none</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>bochs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ramfb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </video>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hostdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='mode'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>subsystem</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='startupPolicy'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>mandatory</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>requisite</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>optional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='subsysType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pci</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='capsType'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='pciBackend'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hostdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <rng supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>random</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </rng>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <filesystem supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='driverType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>path</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>handle</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtiofs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </filesystem>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tpm supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-tis</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-crb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emulator</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>external</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendVersion'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>2.0</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </tpm>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <redirdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </redirdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <channel supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </channel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <crypto supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </crypto>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <interface supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>passt</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </interface>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <panic supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>isa</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>hyperv</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </panic>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <console supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>null</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dev</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pipe</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stdio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>udp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tcp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu-vdagent</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </console>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <gic supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <genid supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backup supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <async-teardown supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <s390-pv supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <ps2 supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tdx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sev supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sgx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hyperv supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='features'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>relaxed</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vapic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>spinlocks</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vpindex</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>runtime</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>synic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stimer</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reset</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vendor_id</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>frequencies</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reenlightenment</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tlbflush</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ipi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>avic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emsr_bitmap</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>xmm_input</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hyperv>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <launchSecurity supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </features>
Jan 23 10:10:49 compute-1 nova_compute[224687]: </domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.282 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 10:10:49 compute-1 nova_compute[224687]: <domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <domain>kvm</domain>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <arch>i686</arch>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <vcpu max='240'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <iothreads supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <os supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='firmware'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <loader supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>rom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pflash</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='readonly'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>yes</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='secure'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </loader>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </os>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='maximumMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <vendor>AMD</vendor>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='succor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='custom' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <memoryBacking supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='sourceType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>anonymous</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>memfd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </memoryBacking>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <disk supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='diskDevice'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>disk</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cdrom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>floppy</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>lun</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ide</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>fdc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>sata</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </disk>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <graphics supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vnc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egl-headless</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </graphics>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <video supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='modelType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vga</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cirrus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>none</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>bochs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ramfb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </video>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hostdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='mode'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>subsystem</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='startupPolicy'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>mandatory</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>requisite</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>optional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='subsysType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pci</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='capsType'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='pciBackend'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hostdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <rng supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>random</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </rng>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <filesystem supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='driverType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>path</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>handle</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtiofs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </filesystem>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tpm supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-tis</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-crb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emulator</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>external</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendVersion'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>2.0</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </tpm>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <redirdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </redirdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <channel supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </channel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <crypto supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </crypto>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <interface supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>passt</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </interface>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <panic supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>isa</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>hyperv</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </panic>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <console supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>null</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dev</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pipe</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stdio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>udp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tcp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu-vdagent</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </console>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <gic supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <genid supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backup supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <async-teardown supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <s390-pv supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <ps2 supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tdx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sev supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sgx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hyperv supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='features'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>relaxed</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vapic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>spinlocks</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vpindex</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>runtime</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>synic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stimer</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reset</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vendor_id</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>frequencies</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reenlightenment</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tlbflush</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ipi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>avic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emsr_bitmap</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>xmm_input</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hyperv>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <launchSecurity supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </features>
Jan 23 10:10:49 compute-1 nova_compute[224687]: </domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.329 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.334 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 10:10:49 compute-1 nova_compute[224687]: <domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <domain>kvm</domain>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <arch>x86_64</arch>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <vcpu max='240'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <iothreads supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <os supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='firmware'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <loader supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>rom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pflash</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='readonly'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>yes</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='secure'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </loader>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </os>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='maximumMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <vendor>AMD</vendor>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='succor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='custom' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <memoryBacking supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='sourceType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>anonymous</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>memfd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </memoryBacking>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <disk supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='diskDevice'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>disk</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cdrom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>floppy</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>lun</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ide</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>fdc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>sata</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </disk>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <graphics supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vnc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egl-headless</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </graphics>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <video supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='modelType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vga</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cirrus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>none</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>bochs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ramfb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </video>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hostdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='mode'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>subsystem</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='startupPolicy'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>mandatory</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>requisite</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>optional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='subsysType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pci</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='capsType'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='pciBackend'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hostdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <rng supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>random</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </rng>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <filesystem supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='driverType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>path</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>handle</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtiofs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </filesystem>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tpm supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-tis</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-crb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emulator</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>external</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendVersion'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>2.0</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </tpm>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <redirdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </redirdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <channel supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </channel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <crypto supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </crypto>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <interface supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>passt</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </interface>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <panic supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>isa</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>hyperv</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </panic>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <console supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>null</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dev</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pipe</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stdio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>udp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tcp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu-vdagent</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </console>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <gic supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <genid supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backup supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <async-teardown supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <s390-pv supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <ps2 supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tdx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sev supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sgx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hyperv supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='features'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>relaxed</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vapic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>spinlocks</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vpindex</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>runtime</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>synic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stimer</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reset</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vendor_id</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>frequencies</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reenlightenment</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tlbflush</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ipi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>avic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emsr_bitmap</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>xmm_input</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hyperv>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <launchSecurity supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </features>
Jan 23 10:10:49 compute-1 nova_compute[224687]: </domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.405 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 10:10:49 compute-1 nova_compute[224687]: <domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <domain>kvm</domain>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <arch>x86_64</arch>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <vcpu max='4096'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <iothreads supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <os supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='firmware'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>efi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <loader supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>rom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pflash</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='readonly'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>yes</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='secure'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>yes</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>no</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </loader>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </os>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='maximumMigratable'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>on</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>off</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <vendor>AMD</vendor>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='succor'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <mode name='custom' supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ddpd-u'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sha512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm3'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sm4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Denverton-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amd-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='auto-ibrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='perfmon-v2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbpb'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='stibp-always-on'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='EPYC-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-128'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-256'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx10-512'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='prefetchiti'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v1'>
Jan 23 10:10:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Haswell-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512er'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512pf'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fma4'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tbm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xop'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='amx-tile'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-bf16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-fp16'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bitalg'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrc'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fzrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='la57'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='taa-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ifma'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cmpccxadd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fbsdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='fsrs'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ibrs-all'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='intel-psfd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='lam'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mcdt-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pbrsb-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='psdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='serialize'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vaes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='hle'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='rtm'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512bw'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512cd'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512dq'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512f'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='avx512vl'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='invpcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pcid'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='pku'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='mpx'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='core-capability'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='split-lock-detect'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='cldemote'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='erms'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='gfni'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdir64b'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='movdiri'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='xsaves'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='athlon-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='core2duo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='coreduo-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='n270-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='ss'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <blockers model='phenom-v1'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnow'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <feature name='3dnowext'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </blockers>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </mode>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </cpu>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <memoryBacking supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <enum name='sourceType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>anonymous</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <value>memfd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </memoryBacking>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <disk supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='diskDevice'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>disk</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cdrom</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>floppy</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>lun</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>fdc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>sata</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </disk>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <graphics supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vnc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egl-headless</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </graphics>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <video supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='modelType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vga</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>cirrus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>none</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>bochs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ramfb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </video>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hostdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='mode'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>subsystem</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='startupPolicy'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>mandatory</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>requisite</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>optional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='subsysType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pci</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>scsi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='capsType'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='pciBackend'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hostdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <rng supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtio-non-transitional</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>random</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>egd</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </rng>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <filesystem supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='driverType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>path</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>handle</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>virtiofs</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </filesystem>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tpm supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-tis</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tpm-crb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emulator</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>external</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendVersion'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>2.0</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </tpm>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <redirdev supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='bus'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>usb</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </redirdev>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <channel supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </channel>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <crypto supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendModel'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>builtin</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </crypto>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <interface supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='backendType'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>default</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>passt</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </interface>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <panic supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='model'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>isa</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>hyperv</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </panic>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <console supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='type'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>null</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vc</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pty</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dev</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>file</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>pipe</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stdio</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>udp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tcp</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>unix</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>qemu-vdagent</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>dbus</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </console>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </devices>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   <features>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <gic supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <genid supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <backup supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <async-teardown supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <s390-pv supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <ps2 supported='yes'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <tdx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sev supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <sgx supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <hyperv supported='yes'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <enum name='features'>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>relaxed</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vapic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>spinlocks</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vpindex</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>runtime</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>synic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>stimer</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reset</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>vendor_id</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>frequencies</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>reenlightenment</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>tlbflush</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>ipi</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>avic</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>emsr_bitmap</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <value>xmm_input</value>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </enum>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       <defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:49 compute-1 nova_compute[224687]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:49 compute-1 nova_compute[224687]:       </defaults>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     </hyperv>
Jan 23 10:10:49 compute-1 nova_compute[224687]:     <launchSecurity supported='no'/>
Jan 23 10:10:49 compute-1 nova_compute[224687]:   </features>
Jan 23 10:10:49 compute-1 nova_compute[224687]: </domainCapabilities>
Jan 23 10:10:49 compute-1 nova_compute[224687]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.473 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.474 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.474 224691 DEBUG nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.478 224691 INFO nova.virt.libvirt.host [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Secure Boot support detected
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.480 224691 INFO nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.481 224691 INFO nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.494 224691 DEBUG nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 23 10:10:49 compute-1 python3.9[225242]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.704 224691 INFO nova.virt.node [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Determined node identity b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from /var/lib/nova/compute_id
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.730 224691 WARNING nova.compute.manager [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Compute nodes ['b22b6ed5-7bca-42dc-9b99-6f2ad6853af7'] for host compute-1.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.790 224691 INFO nova.compute.manager [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.833 224691 WARNING nova.compute.manager [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] No compute node record found for host compute-1.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-1.ctlplane.example.com could not be found.
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.834 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.834 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.835 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.835 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:10:49 compute-1 nova_compute[224687]: 2026-01-23 10:10:49.835 224691 DEBUG oslo_concurrency.processutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c70004050 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:49 compute-1 ceph-mon[80126]: pgmap v518: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:50.309 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:50 compute-1 sudo[225417]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-drdbwhoschiimrrhpyjpnifaxjsyjfce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163049.9490302-3567-719334315765/AnsiballZ_podman_container.py'
Jan 23 10:10:50 compute-1 sudo[225417]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:50.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:50 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:10:50 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1010858224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:50 compute-1 nova_compute[224687]: 2026-01-23 10:10:50.659 224691 DEBUG oslo_concurrency.processutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.823s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:50 compute-1 systemd[1]: Starting libvirt nodedev daemon...
Jan 23 10:10:50 compute-1 systemd[1]: Started libvirt nodedev daemon.
Jan 23 10:10:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:50 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_8] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c003db0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:50 compute-1 python3.9[225419]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 10:10:50 compute-1 sudo[225417]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:50 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:10:50 compute-1 nova_compute[224687]: 2026-01-23 10:10:50.973 224691 WARNING nova.virt.libvirt.driver [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:10:50 compute-1 nova_compute[224687]: 2026-01-23 10:10:50.974 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5226MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:10:50 compute-1 nova_compute[224687]: 2026-01-23 10:10:50.975 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:50 compute-1 nova_compute[224687]: 2026-01-23 10:10:50.975 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:50 compute-1 nova_compute[224687]: 2026-01-23 10:10:50.995 224691 WARNING nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] No compute node record for compute-1.ctlplane.example.com:b22b6ed5-7bca-42dc-9b99-6f2ad6853af7: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 could not be found.
Jan 23 10:10:51 compute-1 nova_compute[224687]: 2026-01-23 10:10:51.026 224691 INFO nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Compute node record created for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com with uuid: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7
Jan 23 10:10:51 compute-1 nova_compute[224687]: 2026-01-23 10:10:51.080 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:10:51 compute-1 nova_compute[224687]: 2026-01-23 10:10:51.081 224691 DEBUG nova.compute.resource_tracker [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:10:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:51 compute-1 sudo[225617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-kzdgqbefkxgzbyfsxdxbgmovtplgevld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163051.1872256-3591-199169830884200/AnsiballZ_systemd.py'
Jan 23 10:10:51 compute-1 sudo[225617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:51 compute-1 nova_compute[224687]: 2026-01-23 10:10:51.618 224691 INFO nova.scheduler.client.report [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] [req-95542ec9-2546-4865-880f-0d0f3dd71826] Created resource provider record via placement API for resource provider with UUID b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 and name compute-1.ctlplane.example.com.
Jan 23 10:10:51 compute-1 nova_compute[224687]: 2026-01-23 10:10:51.641 224691 DEBUG oslo_concurrency.processutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:51 compute-1 python3.9[225620]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Jan 23 10:10:51 compute-1 systemd[1]: Stopping nova_compute container...
Jan 23 10:10:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:52 compute-1 nova_compute[224687]: 2026-01-23 10:10:52.104 224691 DEBUG oslo_concurrency.lockutils [None req-4ede91fc-ab55-4883-a4c2-330efb14898f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.129s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:52 compute-1 nova_compute[224687]: 2026-01-23 10:10:52.105 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:10:52 compute-1 nova_compute[224687]: 2026-01-23 10:10:52.106 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:10:52 compute-1 nova_compute[224687]: 2026-01-23 10:10:52.106 224691 DEBUG oslo_concurrency.lockutils [None req-4e6f8702-4b29-4c7c-b47e-3eaf62136f82 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:10:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:52.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:52 compute-1 systemd[1]: libpod-d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4.scope: Deactivated successfully.
Jan 23 10:10:52 compute-1 systemd[1]: libpod-d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4.scope: Consumed 4.409s CPU time.
Jan 23 10:10:52 compute-1 virtqemud[225011]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Jan 23 10:10:52 compute-1 podman[225644]: 2026-01-23 10:10:52.577787507 +0000 UTC m=+0.721428611 container died d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:10:52 compute-1 virtqemud[225011]: hostname: compute-1
Jan 23 10:10:52 compute-1 virtqemud[225011]: End of file while reading data: Input/output error
Jan 23 10:10:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:52.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:52 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:52 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4-userdata-shm.mount: Deactivated successfully.
Jan 23 10:10:52 compute-1 systemd[1]: var-lib-containers-storage-overlay-4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576-merged.mount: Deactivated successfully.
Jan 23 10:10:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:10:53 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/738253789' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:53 compute-1 ceph-mon[80126]: pgmap v519: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:53 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1010858224' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:54 compute-1 podman[225644]: 2026-01-23 10:10:54.011476898 +0000 UTC m=+2.155117982 container cleanup d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, config_id=edpm, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:10:54 compute-1 podman[225644]: nova_compute
Jan 23 10:10:54 compute-1 podman[225676]: nova_compute
Jan 23 10:10:54 compute-1 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Jan 23 10:10:54 compute-1 systemd[1]: Stopped nova_compute container.
Jan 23 10:10:54 compute-1 systemd[1]: Starting nova_compute container...
Jan 23 10:10:54 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:10:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4a57c491e8a1c4d98872175840f2ad7847c9a870df161144b73b0d9908b7d576/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:54.316 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:54 compute-1 podman[225689]: 2026-01-23 10:10:54.464315282 +0000 UTC m=+0.346538661 container init d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:10:54 compute-1 podman[225689]: 2026-01-23 10:10:54.471283504 +0000 UTC m=+0.353506863 container start d89099556b5fe4e5dce4e0510671bc125797e2bf099f5152a8ae30e75c84dca4 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:10:54 compute-1 nova_compute[225705]: + sudo -E kolla_set_configs
Jan 23 10:10:54 compute-1 podman[225689]: nova_compute
Jan 23 10:10:54 compute-1 systemd[1]: Started nova_compute container.
Jan 23 10:10:54 compute-1 sudo[225617]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Validating config file
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying service configuration files
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /etc/ceph
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Creating directory /etc/ceph
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Writing out command to execute
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:54 compute-1 nova_compute[225705]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Jan 23 10:10:54 compute-1 nova_compute[225705]: ++ cat /run_command
Jan 23 10:10:54 compute-1 nova_compute[225705]: + CMD=nova-compute
Jan 23 10:10:54 compute-1 nova_compute[225705]: + ARGS=
Jan 23 10:10:54 compute-1 nova_compute[225705]: + sudo kolla_copy_cacerts
Jan 23 10:10:54 compute-1 nova_compute[225705]: + [[ ! -n '' ]]
Jan 23 10:10:54 compute-1 nova_compute[225705]: + . kolla_extend_start
Jan 23 10:10:54 compute-1 nova_compute[225705]: Running command: 'nova-compute'
Jan 23 10:10:54 compute-1 nova_compute[225705]: + echo 'Running command: '\''nova-compute'\'''
Jan 23 10:10:54 compute-1 nova_compute[225705]: + umask 0022
Jan 23 10:10:54 compute-1 nova_compute[225705]: + exec nova-compute
Jan 23 10:10:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:54.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:54 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c74001090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:54 compute-1 ceph-mon[80126]: pgmap v520: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:10:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1198102383' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:54 compute-1 ceph-mon[80126]: pgmap v521: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:10:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:10:55.037 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:10:55.038 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:10:55.038 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:55 compute-1 sudo[225866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c 'echo BECOME-SUCCESS-abjafjpjoljnupefbdnfohgzjkfflqnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769163054.777499-3618-50797644342054/AnsiballZ_podman_container.py'
Jan 23 10:10:55 compute-1 sudo[225866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:10:55 compute-1 python3.9[225868]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Jan 23 10:10:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:55 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80001080 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:55 compute-1 systemd[1]: Started libpod-conmon-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7.scope.
Jan 23 10:10:55 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:10:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:55 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Jan 23 10:10:55 compute-1 podman[225895]: 2026-01-23 10:10:55.585930976 +0000 UTC m=+0.136830794 container init cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 10:10:55 compute-1 podman[225895]: 2026-01-23 10:10:55.59864789 +0000 UTC m=+0.149547678 container start cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Jan 23 10:10:55 compute-1 python3.9[225868]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Applying nova statedir ownership
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Jan 23 10:10:55 compute-1 nova_compute_init[225914]: INFO:nova_statedir:Nova statedir ownership complete
Jan 23 10:10:55 compute-1 systemd[1]: libpod-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7.scope: Deactivated successfully.
Jan 23 10:10:55 compute-1 podman[225915]: 2026-01-23 10:10:55.702164726 +0000 UTC m=+0.037361307 container died cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 10:10:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:56 compute-1 sudo[225866]: pam_unix(sudo:session): session closed for user root
Jan 23 10:10:56 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7-userdata-shm.mount: Deactivated successfully.
Jan 23 10:10:56 compute-1 systemd[1]: var-lib-containers-storage-overlay-3cfec93db089755fabff22afa1c4174975aa597b02c6eb77e05b5113bcf3326b-merged.mount: Deactivated successfully.
Jan 23 10:10:56 compute-1 podman[225921]: 2026-01-23 10:10:56.199964148 +0000 UTC m=+0.511166707 container cleanup cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7 (image=quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b, name=nova_compute_init, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:526afed30c44ef41d54d63a4f4db122bc603f775243ae350a59d2e0b5050076b', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible)
Jan 23 10:10:56 compute-1 systemd[1]: libpod-conmon-cb097c245a55b5a03d57d35d70e585213bac12a049aa1f9eea935ed112ceded7.scope: Deactivated successfully.
Jan 23 10:10:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:56.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:56.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:56 compute-1 sshd-session[201650]: Connection closed by 192.168.122.30 port 57496
Jan 23 10:10:56 compute-1 sshd-session[201647]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:10:56 compute-1 systemd[1]: session-53.scope: Deactivated successfully.
Jan 23 10:10:56 compute-1 systemd[1]: session-53.scope: Consumed 2min 4.413s CPU time.
Jan 23 10:10:56 compute-1 systemd-logind[807]: Session 53 logged out. Waiting for processes to exit.
Jan 23 10:10:56 compute-1 systemd-logind[807]: Removed session 53.
Jan 23 10:10:56 compute-1 nova_compute[225705]: 2026-01-23 10:10:56.677 225709 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:56 compute-1 nova_compute[225705]: 2026-01-23 10:10:56.677 225709 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:56 compute-1 nova_compute[225705]: 2026-01-23 10:10:56.677 225709 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Jan 23 10:10:56 compute-1 nova_compute[225705]: 2026-01-23 10:10:56.678 225709 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Jan 23 10:10:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:56 compute-1 nova_compute[225705]: 2026-01-23 10:10:56.839 225709 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:56 compute-1 nova_compute[225705]: 2026-01-23 10:10:56.862 225709 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:56 compute-1 nova_compute[225705]: 2026-01-23 10:10:56.863 225709 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Jan 23 10:10:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.298 225709 INFO nova.virt.driver [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Jan 23 10:10:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:10:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.406 225709 INFO nova.compute.provider_config [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.415 225709 DEBUG oslo_concurrency.lockutils [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.415 225709 DEBUG oslo_concurrency.lockutils [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_concurrency.lockutils [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.416 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.417 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.418 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console_host                   = compute-1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.419 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.420 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.421 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] host                           = compute-1.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.422 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.423 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.424 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.425 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.426 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.427 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.428 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.429 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.429 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.429 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.430 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] my_block_storage_ip            = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] my_ip                          = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.431 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.432 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.433 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.434 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.435 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.435 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.435 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.436 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.437 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.438 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.439 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.440 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.441 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.442 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.443 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.444 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.445 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.446 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.447 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.448 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.449 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.450 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.451 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.452 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.453 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.454 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.455 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.456 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.457 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.458 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.459 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.460 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.461 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.462 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.463 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.464 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.465 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.466 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.467 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.468 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.469 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.470 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.471 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.472 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.473 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.474 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.475 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.476 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.477 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.478 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.479 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.480 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.481 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.482 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.486 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.487 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.488 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.489 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.490 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.491 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.492 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.493 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.494 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.495 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.496 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.497 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.498 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.499 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.500 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.501 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.502 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.503 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 WARNING oslo_config.cfg [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Jan 23 10:10:57 compute-1 nova_compute[225705]: live_migration_uri is deprecated for removal in favor of two other options that
Jan 23 10:10:57 compute-1 nova_compute[225705]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Jan 23 10:10:57 compute-1 nova_compute[225705]: and ``live_migration_inbound_addr`` respectively.
Jan 23 10:10:57 compute-1 nova_compute[225705]: ).  Its value may be silently ignored in the future.
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.504 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.505 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.506 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_secret_uuid        = f3005f84-239a-55b6-a948-8f1fb592b920 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.507 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.508 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.509 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.510 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.511 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.512 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.513 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.514 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.515 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.516 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.517 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.518 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.519 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.520 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.521 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.522 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.523 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.524 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.525 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.526 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.527 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.528 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.529 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.530 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.531 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.532 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.533 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.534 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.535 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.536 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.537 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.538 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.539 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.540 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.541 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.542 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.543 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.544 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.545 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.546 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.server_proxyclient_address = 192.168.122.101 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.547 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.548 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.549 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.550 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.551 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.552 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.553 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.554 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.555 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.556 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.557 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.558 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.559 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.560 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.561 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.562 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.563 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.564 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.565 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.566 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.567 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.568 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.569 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 ceph-mon[80126]: pgmap v522: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.570 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.571 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.571 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.572 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.573 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.574 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.575 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.576 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.577 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.578 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.579 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.580 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.581 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.582 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.582 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.582 225709 DEBUG oslo_service.service [None req-d1431bd1-2a43-454b-9aa8-f5b521c13620 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.583 225709 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)
Jan 23 10:10:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.600 225709 INFO nova.virt.node [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Determined node identity b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from /var/lib/nova/compute_id
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.601 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.614 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7ff23e7c0970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.617 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7ff23e7c0970> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.618 225709 INFO nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Connection event '1' reason 'None'
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.625 225709 INFO nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host capabilities <capabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]: 
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <host>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <uuid>53821a39-1f4a-4bf2-b036-ba3044ea8780</uuid>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <arch>x86_64</arch>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <microcode version='16777317'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <signature family='23' model='49' stepping='0'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <maxphysaddr mode='emulate' bits='40'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='x2apic'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='tsc-deadline'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='osxsave'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='hypervisor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='tsc_adjust'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='spec-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='stibp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='arch-capabilities'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='cmp_legacy'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='topoext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='virt-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='lbrv'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='tsc-scale'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='vmcb-clean'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='pause-filter'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='pfthreshold'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='svme-addr-chk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='rdctl-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='skip-l1dfl-vmentry'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='mds-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature name='pschange-mc-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <pages unit='KiB' size='4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <pages unit='KiB' size='2048'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <pages unit='KiB' size='1048576'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <power_management>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <suspend_mem/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </power_management>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <iommu support='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <migration_features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <live/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <uri_transports>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <uri_transport>tcp</uri_transport>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <uri_transport>rdma</uri_transport>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </uri_transports>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </migration_features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <topology>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <cells num='1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <cell id='0'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           <memory unit='KiB'>7864316</memory>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           <pages unit='KiB' size='4'>1966079</pages>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           <pages unit='KiB' size='2048'>0</pages>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           <pages unit='KiB' size='1048576'>0</pages>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           <distances>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <sibling id='0' value='10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           </distances>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           <cpus num='8'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:           </cpus>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         </cell>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </cells>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </topology>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <cache>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </cache>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <secmodel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model>selinux</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <doi>0</doi>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </secmodel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <secmodel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model>dac</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <doi>0</doi>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <baselabel type='kvm'>+107:+107</baselabel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <baselabel type='qemu'>+107:+107</baselabel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </secmodel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </host>
Jan 23 10:10:57 compute-1 nova_compute[225705]: 
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <guest>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <os_type>hvm</os_type>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <arch name='i686'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <wordsize>32</wordsize>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <domain type='qemu'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <domain type='kvm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </arch>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <pae/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <nonpae/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <apic default='on' toggle='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <cpuselection/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <deviceboot/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <externalSnapshot/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </guest>
Jan 23 10:10:57 compute-1 nova_compute[225705]: 
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <guest>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <os_type>hvm</os_type>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <arch name='x86_64'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <wordsize>64</wordsize>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <domain type='qemu'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <domain type='kvm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </arch>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <acpi default='on' toggle='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <apic default='on' toggle='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <cpuselection/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <deviceboot/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <disksnapshot default='on' toggle='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <externalSnapshot/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </guest>
Jan 23 10:10:57 compute-1 nova_compute[225705]: 
Jan 23 10:10:57 compute-1 nova_compute[225705]: </capabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]: 
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.632 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.635 225709 DEBUG nova.virt.libvirt.volume.mount [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.637 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Jan 23 10:10:57 compute-1 nova_compute[225705]: <domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <arch>i686</arch>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <vcpu max='4096'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <os supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='firmware'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <loader supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>rom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pflash</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='readonly'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>yes</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='secure'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </loader>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </os>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>anonymous</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>memfd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </memoryBacking>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <disk supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>disk</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cdrom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>floppy</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>lun</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>fdc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>sata</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vnc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <video supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='modelType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vga</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cirrus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>none</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>bochs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ramfb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </video>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='mode'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>subsystem</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>mandatory</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>requisite</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>optional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pci</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hostdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <rng supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>random</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='driverType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>path</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>handle</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </filesystem>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emulator</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>external</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>2.0</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </tpm>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </redirdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <channel supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </channel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </crypto>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <interface supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>passt</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <panic supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>isa</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>hyperv</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </panic>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <console supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>null</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dev</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pipe</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stdio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>udp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tcp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </console>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <gic supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sev supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='features'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>relaxed</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vapic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vpindex</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>runtime</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>synic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stimer</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reset</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>frequencies</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ipi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>avic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hyperv>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </features>
Jan 23 10:10:57 compute-1 nova_compute[225705]: </domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.647 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Jan 23 10:10:57 compute-1 nova_compute[225705]: <domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <arch>i686</arch>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <vcpu max='240'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <os supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='firmware'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <loader supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>rom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pflash</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='readonly'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>yes</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='secure'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </loader>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </os>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>anonymous</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>memfd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </memoryBacking>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <disk supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>disk</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cdrom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>floppy</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>lun</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ide</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>fdc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>sata</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vnc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <video supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='modelType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vga</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cirrus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>none</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>bochs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ramfb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </video>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='mode'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>subsystem</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>mandatory</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>requisite</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>optional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pci</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hostdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <rng supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>random</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='driverType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>path</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>handle</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </filesystem>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emulator</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>external</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>2.0</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </tpm>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </redirdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <channel supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </channel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </crypto>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <interface supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>passt</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <panic supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>isa</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>hyperv</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </panic>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <console supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>null</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dev</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pipe</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stdio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>udp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tcp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </console>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <gic supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sev supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='features'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>relaxed</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vapic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vpindex</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>runtime</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>synic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stimer</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reset</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>frequencies</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ipi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>avic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hyperv>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </features>
Jan 23 10:10:57 compute-1 nova_compute[225705]: </domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.698 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.704 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Jan 23 10:10:57 compute-1 nova_compute[225705]: <domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <machine>pc-q35-rhel9.8.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <arch>x86_64</arch>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <vcpu max='4096'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <os supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='firmware'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>efi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <loader supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>rom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pflash</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='readonly'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>yes</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='secure'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>yes</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </loader>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </os>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>anonymous</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>memfd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </memoryBacking>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <disk supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>disk</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cdrom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>floppy</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>lun</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>fdc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>sata</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vnc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <video supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='modelType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vga</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cirrus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>none</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>bochs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ramfb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </video>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='mode'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>subsystem</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>mandatory</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>requisite</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>optional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pci</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hostdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <rng supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>random</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='driverType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>path</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>handle</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </filesystem>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emulator</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>external</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>2.0</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </tpm>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </redirdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <channel supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </channel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </crypto>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <interface supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>passt</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <panic supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>isa</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>hyperv</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </panic>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <console supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>null</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dev</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pipe</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stdio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>udp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tcp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </console>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <gic supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sev supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='features'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>relaxed</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vapic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vpindex</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>runtime</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>synic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stimer</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reset</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>frequencies</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ipi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>avic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hyperv>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </features>
Jan 23 10:10:57 compute-1 nova_compute[225705]: </domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.793 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Jan 23 10:10:57 compute-1 nova_compute[225705]: <domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <path>/usr/libexec/qemu-kvm</path>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <domain>kvm</domain>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <machine>pc-i440fx-rhel7.6.0</machine>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <arch>x86_64</arch>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <vcpu max='240'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <iothreads supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <os supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='firmware'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <loader supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>rom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pflash</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='readonly'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>yes</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='secure'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>no</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </loader>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </os>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-passthrough' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='hostPassthroughMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='maximum' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='maximumMigratable'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>on</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>off</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='host-model' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <vendor>AMD</vendor>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <maxphysaddr mode='passthrough' limit='40'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='x2apic'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-deadline'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='hypervisor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc_adjust'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='spec-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='stibp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='cmp_legacy'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='overflow-recov'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='succor'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='amd-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='virt-ssbd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lbrv'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='tsc-scale'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='vmcb-clean'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='flushbyasid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pause-filter'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='pfthreshold'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='svme-addr-chk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <feature policy='disable' name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <mode name='custom' supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Broadwell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cascadelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='ClearwaterForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ddpd-u'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sha512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm3'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sm4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Cooperlake-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Denverton-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Dhyana-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Genoa-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Milan-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Rome-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-Turin-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amd-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='auto-ibrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vp2intersect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fs-gs-base-ns'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibpb-brtype'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='no-nested-data-bp'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='null-sel-clr-base'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='perfmon-v2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbpb'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='srso-user-kernel-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='stibp-always-on'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='EPYC-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='GraniteRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-128'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-256'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx10-512'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='prefetchiti'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Haswell-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-noTSX'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v6'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Icelake-Server-v7'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='IvyBridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='KnightsMill-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4fmaps'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-4vnniw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512er'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512pf'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G4-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Opteron_G5-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fma4'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tbm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xop'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SapphireRapids-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='amx-tile'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-bf16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-fp16'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512-vpopcntdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bitalg'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vbmi2'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrc'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fzrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='la57'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='taa-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='tsx-ldtrk'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='SierraForest-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ifma'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-ne-convert'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx-vnni-int8'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bhi-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='bus-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cmpccxadd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fbsdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='fsrs'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ibrs-all'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='intel-psfd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ipred-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='lam'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mcdt-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pbrsb-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='psdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rrsba-ctrl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='sbdr-ssdp-no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='serialize'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vaes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='vpclmulqdq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Client-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='hle'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='rtm'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Skylake-Server-v5'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512bw'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512cd'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512dq'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512f'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='avx512vl'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='invpcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pcid'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='pku'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='mpx'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v2'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v3'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='core-capability'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='split-lock-detect'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='Snowridge-v4'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='cldemote'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='erms'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='gfni'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdir64b'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='movdiri'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='xsaves'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='athlon-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='core2duo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='coreduo-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='n270-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='ss'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <blockers model='phenom-v1'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnow'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <feature name='3dnowext'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </blockers>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </mode>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <memoryBacking supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <enum name='sourceType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>anonymous</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <value>memfd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </memoryBacking>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <disk supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='diskDevice'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>disk</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cdrom</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>floppy</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>lun</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ide</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>fdc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>sata</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <graphics supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vnc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egl-headless</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <video supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='modelType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vga</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>cirrus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>none</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>bochs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ramfb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </video>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hostdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='mode'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>subsystem</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='startupPolicy'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>mandatory</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>requisite</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>optional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='subsysType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pci</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>scsi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='capsType'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='pciBackend'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hostdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <rng supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtio-non-transitional</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>random</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>egd</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <filesystem supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='driverType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>path</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>handle</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>virtiofs</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </filesystem>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tpm supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-tis</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tpm-crb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emulator</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>external</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendVersion'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>2.0</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </tpm>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <redirdev supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='bus'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>usb</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </redirdev>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <channel supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </channel>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <crypto supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendModel'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>builtin</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </crypto>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <interface supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='backendType'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>default</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>passt</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <panic supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='model'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>isa</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>hyperv</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </panic>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <console supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='type'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>null</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vc</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pty</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dev</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>file</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>pipe</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stdio</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>udp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tcp</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>unix</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>qemu-vdagent</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>dbus</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </console>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   <features>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <gic supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <vmcoreinfo supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <genid supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backingStoreInput supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <backup supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <async-teardown supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <s390-pv supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <ps2 supported='yes'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <tdx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sev supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <sgx supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <hyperv supported='yes'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <enum name='features'>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>relaxed</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vapic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>spinlocks</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vpindex</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>runtime</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>synic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>stimer</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reset</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>vendor_id</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>frequencies</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>reenlightenment</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>tlbflush</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>ipi</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>avic</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>emsr_bitmap</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <value>xmm_input</value>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </enum>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       <defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <spinlocks>4095</spinlocks>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <stimer_direct>on</stimer_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_direct>on</tlbflush_direct>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <tlbflush_extended>on</tlbflush_extended>
Jan 23 10:10:57 compute-1 nova_compute[225705]:         <vendor_id>Linux KVM Hv</vendor_id>
Jan 23 10:10:57 compute-1 nova_compute[225705]:       </defaults>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     </hyperv>
Jan 23 10:10:57 compute-1 nova_compute[225705]:     <launchSecurity supported='no'/>
Jan 23 10:10:57 compute-1 nova_compute[225705]:   </features>
Jan 23 10:10:57 compute-1 nova_compute[225705]: </domainCapabilities>
Jan 23 10:10:57 compute-1 nova_compute[225705]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.881 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.881 225709 INFO nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Secure Boot support detected
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.883 225709 INFO nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.884 225709 INFO nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.893 225709 DEBUG nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.911 225709 INFO nova.virt.node [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Determined node identity b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from /var/lib/nova/compute_id
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.927 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Verified node b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 matches my host compute-1.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Jan 23 10:10:57 compute-1 nova_compute[225705]: 2026-01-23 10:10:57.953 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Jan 23 10:10:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:58 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800025d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.039 225709 DEBUG oslo_concurrency.lockutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.040 225709 DEBUG oslo_concurrency.lockutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.040 225709 DEBUG oslo_concurrency.lockutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.040 225709 DEBUG nova.compute.resource_tracker [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.041 225709 DEBUG oslo_concurrency.processutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:10:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:10:58 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/1853406190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:10:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:10:58.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:10:58 compute-1 rsyslogd[1006]: imjournal from <np0005593294:nova_compute>: begin to drop messages due to rate-limiting
Jan 23 10:10:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:10:58 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1515803724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.511 225709 DEBUG oslo_concurrency.processutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:10:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:10:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:10:58.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.658 225709 WARNING nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.659 225709 DEBUG nova.compute.resource_tracker [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5215MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.659 225709 DEBUG oslo_concurrency.lockutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.659 225709 DEBUG oslo_concurrency.lockutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:10:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1853406190' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/758454433' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1515803724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:58 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800025d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.830 225709 DEBUG nova.compute.resource_tracker [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.830 225709 DEBUG nova.compute.resource_tracker [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.849 225709 DEBUG nova.scheduler.client.report [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.863 225709 DEBUG nova.scheduler.client.report [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.864 225709 DEBUG nova.compute.provider_tree [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:10:58 compute-1 nova_compute[225705]: 2026-01-23 10:10:58.934 225709 DEBUG nova.scheduler.client.report [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.002 225709 DEBUG nova.scheduler.client.report [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.037 225709 DEBUG oslo_concurrency.processutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:10:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:10:59 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1031836071' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.502 225709 DEBUG oslo_concurrency.processutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.508 225709 DEBUG nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Jan 23 10:10:59 compute-1 nova_compute[225705]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.509 225709 INFO nova.virt.libvirt.host [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] kernel doesn't support AMD SEV
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.510 225709 DEBUG nova.compute.provider_tree [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.510 225709 DEBUG nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:10:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:10:59 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.555 225709 DEBUG nova.scheduler.client.report [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Updated inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.556 225709 DEBUG nova.compute.provider_tree [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Updating resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.556 225709 DEBUG nova.compute.provider_tree [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.673 225709 DEBUG nova.compute.provider_tree [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Updating resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.694 225709 DEBUG nova.compute.resource_tracker [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.695 225709 DEBUG oslo_concurrency.lockutils [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.695 225709 DEBUG nova.service [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.771 225709 DEBUG nova.service [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Jan 23 10:10:59 compute-1 nova_compute[225705]: 2026-01-23 10:10:59.771 225709 DEBUG nova.servicegroup.drivers.db [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] DB_Driver: join new ServiceGroup member compute-1.ctlplane.example.com to the compute group, service = <Service: host=compute-1.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Jan 23 10:10:59 compute-1 ceph-mon[80126]: pgmap v523: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:10:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1862771379' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2360008108' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:10:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1031836071' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:00 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:00.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:00.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:00 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:11:00 compute-1 ceph-mon[80126]: pgmap v524: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:11:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:00 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:01 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800025d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:01 compute-1 sudo[226059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:11:01 compute-1 sudo[226059]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:01 compute-1 podman[226052]: 2026-01-23 10:11:01.694176695 +0000 UTC m=+0.078156742 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:11:01 compute-1 sudo[226059]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:02 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:02.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:02.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:02 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:03 compute-1 ceph-mon[80126]: pgmap v525: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 1023 B/s wr, 4 op/s
Jan 23 10:11:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:03 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101103 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:11:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:04 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800025d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:04.333 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:04.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:04 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:05 compute-1 ceph-mon[80126]: pgmap v526: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1023 B/s wr, 4 op/s
Jan 23 10:11:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:05 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Jan 23 10:11:05 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:05.993318) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:11:05 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Jan 23 10:11:05 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163065993598, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 1278, "num_deletes": 251, "total_data_size": 3205694, "memory_usage": 3246304, "flush_reason": "Manual Compaction"}
Jan 23 10:11:05 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Jan 23 10:11:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:06 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680028c0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066035360, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 2091630, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19289, "largest_seqno": 20561, "table_properties": {"data_size": 2086047, "index_size": 2978, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 12014, "raw_average_key_size": 19, "raw_value_size": 2074807, "raw_average_value_size": 3435, "num_data_blocks": 132, "num_entries": 604, "num_filter_entries": 604, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769162946, "oldest_key_time": 1769162946, "file_creation_time": 1769163065, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 42154 microseconds, and 11715 cpu microseconds.
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.035539) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 2091630 bytes OK
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.035609) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.039360) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.039421) EVENT_LOG_v1 {"time_micros": 1769163066039412, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.039458) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 3199593, prev total WAL file size 3199593, number of live WAL files 2.
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.041376) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031323535' seq:72057594037927935, type:22 .. '7061786F730031353037' seq:0, type:0; will stop at (end)
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(2042KB)], [36(12MB)]
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066041538, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 15519386, "oldest_snapshot_seqno": -1}
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 5041 keys, 13259093 bytes, temperature: kUnknown
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066143102, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 13259093, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13223880, "index_size": 21535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 12613, "raw_key_size": 128501, "raw_average_key_size": 25, "raw_value_size": 13130614, "raw_average_value_size": 2604, "num_data_blocks": 885, "num_entries": 5041, "num_filter_entries": 5041, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163066, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.144064) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 13259093 bytes
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.147012) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 152.8 rd, 130.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 12.8 +0.0 blob) out(12.6 +0.0 blob), read-write-amplify(13.8) write-amplify(6.3) OK, records in: 5559, records dropped: 518 output_compression: NoCompression
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.147042) EVENT_LOG_v1 {"time_micros": 1769163066147028, "job": 20, "event": "compaction_finished", "compaction_time_micros": 101542, "compaction_time_cpu_micros": 36752, "output_level": 6, "num_output_files": 1, "total_output_size": 13259093, "num_input_records": 5559, "num_output_records": 5041, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066147810, "job": 20, "event": "table_file_deletion", "file_number": 38}
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163066150965, "job": 20, "event": "table_file_deletion", "file_number": 36}
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.040924) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.151160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.151171) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.151173) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.151176) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:11:06.151178) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:11:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:06.336 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:06.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:06 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:07 compute-1 ceph-mon[80126]: pgmap v527: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.9 KiB/s rd, 1023 B/s wr, 4 op/s
Jan 23 10:11:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:07 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:08 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:08.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:08.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:08 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:08 compute-1 sshd-session[226099]: Invalid user sol from 45.148.10.240 port 44984
Jan 23 10:11:08 compute-1 sshd-session[226099]: Connection closed by invalid user sol 45.148.10.240 port 44984 [preauth]
Jan 23 10:11:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:09 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:09 compute-1 ceph-mon[80126]: pgmap v528: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:11:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:10 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:10.342 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:10.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:10 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:11 compute-1 ceph-mon[80126]: pgmap v529: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:11:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:12 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:12.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:12.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:12 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:13 compute-1 ceph-mon[80126]: pgmap v530: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:11:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:13 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:14 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:14.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:14.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:14 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:15 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:15 compute-1 ceph-mon[80126]: pgmap v531: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:16 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:16.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:16.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:16 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80003e50 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:17 compute-1 ceph-mon[80126]: pgmap v532: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:17 compute-1 sudo[226107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:11:17 compute-1 sudo[226107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:17 compute-1 sudo[226107]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:17 compute-1 sudo[226133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:11:17 compute-1 sudo[226133]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c600016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:18 compute-1 sudo[226133]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:18.356 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:18.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:18 compute-1 podman[226190]: 2026-01-23 10:11:18.690312162 +0000 UTC m=+0.092143966 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Jan 23 10:11:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-1 ceph-mon[80126]: pgmap v533: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:11:19 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:11:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:20.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:20.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:20 compute-1 nova_compute[225705]: 2026-01-23 10:11:20.773 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:20 compute-1 nova_compute[225705]: 2026-01-23 10:11:20.905 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:21 compute-1 ceph-mon[80126]: pgmap v534: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:21 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c880045d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:21 compute-1 sudo[226218]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:11:21 compute-1 sudo[226218]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:21 compute-1 sudo[226218]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:22 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:22.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:22.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:22 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:23 compute-1 ceph-mon[80126]: pgmap v535: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:23 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:24 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:24 compute-1 sudo[226244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:11:24 compute-1 sudo[226244]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:24 compute-1 sudo[226244]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:24.366 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:24.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:24 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:25 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:25 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:11:25 compute-1 ceph-mon[80126]: pgmap v536: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:25 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:26 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:26.369 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:26.679 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:26 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60001fc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:27 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:27 compute-1 ceph-mon[80126]: pgmap v537: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:28 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:28.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:28.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:28 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1318506093' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1318506093' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:29 compute-1 ceph-mon[80126]: pgmap v538: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3198864426' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3198864426' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:29 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:30 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:30.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:30 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1327855081' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:11:30 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1327855081' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:11:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:30.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:30 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:31 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:32 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:32 compute-1 ceph-mon[80126]: pgmap v539: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:32.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:32 compute-1 podman[226273]: 2026-01-23 10:11:32.651359856 +0000 UTC m=+0.056975750 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Jan 23 10:11:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:32.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:32 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:33 compute-1 ceph-mon[80126]: pgmap v540: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:33 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:34 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:34.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:34.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:34 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:35 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:35 compute-1 ceph-mon[80126]: pgmap v541: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:36 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:36.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:36.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:36 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:37 compute-1 ceph-mon[80126]: pgmap v542: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:37 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f30 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:38 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:38.391 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:38.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:38 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:39 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:39 compute-1 ceph-mon[80126]: pgmap v543: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:40 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:40.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:40.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:40 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:41 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80004770 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:41 compute-1 ceph-mon[80126]: pgmap v544: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:41 compute-1 sudo[226299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:11:41 compute-1 sudo[226299]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:11:41 compute-1 sudo[226299]: pam_unix(sudo:session): session closed for user root
Jan 23 10:11:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:42 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:42.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:42.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:42 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003f90 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:43 compute-1 ceph-mon[80126]: pgmap v545: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:43 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:44 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:44.397 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:44.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:44 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:45 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003fb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:45 compute-1 ceph-mon[80126]: pgmap v546: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:46 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:46.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:46.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:46 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:47 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:47 compute-1 ceph-mon[80126]: pgmap v547: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:48 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003fb0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:48.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 10:11:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:48.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 10:11:48 compute-1 ceph-mon[80126]: pgmap v548: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:48 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:49 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:49 compute-1 podman[226330]: 2026-01-23 10:11:49.690265484 +0000 UTC m=+0.096015159 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:11:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:50 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:11:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:50.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:50.717 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:50 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003fd0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:51 compute-1 ceph-mon[80126]: pgmap v549: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:51 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:52 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:52.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:52.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:52 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:53 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:53 compute-1 ceph-mon[80126]: pgmap v550: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:11:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:54 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:54.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:54.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:54 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:11:55.038 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:11:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:11:55.038 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:11:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:11:55.039 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:11:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:55 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:55 compute-1 ceph-mon[80126]: pgmap v551: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:56.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:11:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:56.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:11:56 compute-1 ceph-mon[80126]: pgmap v552: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:56 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68003ff0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:56 compute-1 nova_compute[225705]: 2026-01-23 10:11:56.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-1 nova_compute[225705]: 2026-01-23 10:11:56.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:56 compute-1 nova_compute[225705]: 2026-01-23 10:11:56.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:11:56 compute-1 nova_compute[225705]: 2026-01-23 10:11:56.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.071 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.071 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.071 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.072 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.072 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.072 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.072 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.072 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.073 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.106 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.106 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.106 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.106 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.107 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:11:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:11:57 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4280706252' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:57 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.574 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.756 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.758 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5291MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.758 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.758 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:11:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1555214160' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4280706252' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/354272967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.894 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.894 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:11:57 compute-1 nova_compute[225705]: 2026-01-23 10:11:57.947 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:11:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:58 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:11:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:11:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:11:58.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:11:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:11:58 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3720768701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:58 compute-1 nova_compute[225705]: 2026-01-23 10:11:58.471 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.524s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:11:58 compute-1 nova_compute[225705]: 2026-01-23 10:11:58.477 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:11:58 compute-1 nova_compute[225705]: 2026-01-23 10:11:58.562 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:11:58 compute-1 nova_compute[225705]: 2026-01-23 10:11:58.564 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:11:58 compute-1 nova_compute[225705]: 2026-01-23 10:11:58.564 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:11:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:11:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:11:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:11:58.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:11:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:58 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3890008546' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:58 compute-1 ceph-mon[80126]: pgmap v553: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:11:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3720768701' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:11:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:11:59 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:11:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4085484174' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:12:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:00 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:00.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:00.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:00 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:00 compute-1 ceph-mon[80126]: pgmap v554: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:01 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:01 compute-1 sudo[226406]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:12:01 compute-1 sudo[226406]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:01 compute-1 sudo[226406]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:02 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c002830 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:02.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:02.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:02 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:03 compute-1 ceph-mon[80126]: pgmap v555: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:12:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:03 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:03 compute-1 podman[226432]: 2026-01-23 10:12:03.656436322 +0000 UTC m=+0.058217389 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 10:12:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:04 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:04.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:04.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:04 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:05 compute-1 ceph-mon[80126]: pgmap v556: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:05 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:06 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:06.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:06.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:06 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:07 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c68004090 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:07 compute-1 ceph-mon[80126]: pgmap v557: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:08 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:08.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:08.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:08 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:09 compute-1 ceph-mon[80126]: pgmap v558: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:09 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740012f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:10 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680040b0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:10.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:10.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:10 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:11 compute-1 ceph-mon[80126]: pgmap v559: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:11 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:12 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:12.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:12.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:12 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680040d0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:13 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:14 compute-1 ceph-mon[80126]: pgmap v560: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:12:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:14 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c88004ef0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:14.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:14.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:14 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c740045f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:15 compute-1 ceph-mon[80126]: pgmap v561: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:15 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680040f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:16 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:16.434 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:16.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:16 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60001b80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:17 compute-1 ceph-mon[80126]: pgmap v562: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:17 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800035e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c680040f0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:18.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:18.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:18 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c6c004140 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:19 compute-1 ceph-mon[80126]: pgmap v563: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:19 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c60001b80 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c800035e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:20.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:20 compute-1 podman[226461]: 2026-01-23 10:12:20.689279577 +0000 UTC m=+0.097655371 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:12:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:20.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:20 compute-1 kernel: ganesha.nfsd[226458]: segfault at 50 ip 00007f5d14c8932e sp 00007f5c7cff8210 error 4 in libntirpc.so.5.8[7f5d14c6e000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 23 10:12:20 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:12:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[211110]: 23/01/2026 10:12:20 : epoch 697348c1 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5c80002150 fd 38 proxy ignored for local
Jan 23 10:12:20 compute-1 systemd[1]: Started Process Core Dump (PID 226487/UID 0).
Jan 23 10:12:21 compute-1 ceph-mon[80126]: pgmap v564: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:22 compute-1 sudo[226490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:12:22 compute-1 sudo[226490]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:22 compute-1 sudo[226490]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:22.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:22 compute-1 systemd-coredump[226488]: Process 211134 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 66:
                                                    #0  0x00007f5d14c8932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:12:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:22 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:12:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:22.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:22 compute-1 systemd[1]: systemd-coredump@8-226487-0.service: Deactivated successfully.
Jan 23 10:12:22 compute-1 systemd[1]: systemd-coredump@8-226487-0.service: Consumed 1.934s CPU time.
Jan 23 10:12:22 compute-1 podman[226520]: 2026-01-23 10:12:22.922920601 +0000 UTC m=+0.028108054 container died f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 10:12:22 compute-1 systemd[1]: var-lib-containers-storage-overlay-1603e4fd7186137f2b1d3aaf85baad9e68881d714d19ff2b8a04631378ac54fb-merged.mount: Deactivated successfully.
Jan 23 10:12:22 compute-1 podman[226520]: 2026-01-23 10:12:22.963916362 +0000 UTC m=+0.069103835 container remove f8d1c98a3d1c54d60fdfeae9f62e6f4bd47666d00b370a24430520a7367527a6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1)
Jan 23 10:12:22 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:12:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:23 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:12:23 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.196s CPU time.
Jan 23 10:12:23 compute-1 ceph-mon[80126]: pgmap v565: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:12:24 compute-1 sudo[226562]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:12:24 compute-1 sudo[226562]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:24 compute-1 sudo[226562]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:24.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:24 compute-1 sudo[226587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:12:24 compute-1 sudo[226587]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:24.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:25 compute-1 sudo[226587]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:25 compute-1 ceph-mon[80126]: pgmap v566: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:26.446 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:26.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:12:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:12:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:12:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:12:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:12:27 compute-1 ceph-mon[80126]: pgmap v567: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:28.449 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:28.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101228 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:12:29 compute-1 ceph-mon[80126]: pgmap v568: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:30.452 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101230 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:12:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:30.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:31 compute-1 ceph-mon[80126]: pgmap v569: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:12:32 compute-1 sudo[226649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:12:32 compute-1 sudo[226649]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:32 compute-1 sudo[226649]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:32.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:32.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:32 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:32 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:12:32 compute-1 ceph-mon[80126]: pgmap v570: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:12:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:33 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 9.
Jan 23 10:12:33 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:12:33 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.196s CPU time.
Jan 23 10:12:33 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:12:33 compute-1 podman[226721]: 2026-01-23 10:12:33.597762414 +0000 UTC m=+0.043084897 container create 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, io.buildah.version=1.40.1, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:12:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10803eb9a75f48fb7cb186471a2a3805dc4398d47b009ba77e7fc291e4b25cc7/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:12:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10803eb9a75f48fb7cb186471a2a3805dc4398d47b009ba77e7fc291e4b25cc7/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:12:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10803eb9a75f48fb7cb186471a2a3805dc4398d47b009ba77e7fc291e4b25cc7/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:12:33 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/10803eb9a75f48fb7cb186471a2a3805dc4398d47b009ba77e7fc291e4b25cc7/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:12:33 compute-1 podman[226721]: 2026-01-23 10:12:33.667038151 +0000 UTC m=+0.112360664 container init 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:12:33 compute-1 podman[226721]: 2026-01-23 10:12:33.575730026 +0000 UTC m=+0.021052559 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:12:33 compute-1 podman[226721]: 2026-01-23 10:12:33.675605263 +0000 UTC m=+0.120927756 container start 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:12:33 compute-1 bash[226721]: 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6
Jan 23 10:12:33 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:12:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:12:33 compute-1 podman[226743]: 2026-01-23 10:12:33.788325878 +0000 UTC m=+0.062518794 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 10:12:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:34.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:34 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3457274779' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-1 ceph-mon[80126]: from='client.24496 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1987316555' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-1 ceph-mon[80126]: from='client.14898 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-1 ceph-mon[80126]: from='client.14898 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Jan 23 10:12:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:34.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:35 compute-1 ceph-mon[80126]: pgmap v571: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 85 B/s rd, 0 op/s
Jan 23 10:12:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:36.459 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:36.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:37 compute-1 ceph-mon[80126]: pgmap v572: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 85 B/s wr, 114 op/s
Jan 23 10:12:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:38.461 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:38.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:39 compute-1 ceph-mon[80126]: pgmap v573: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 85 B/s wr, 114 op/s
Jan 23 10:12:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:12:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:12:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:40.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:40.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:41 compute-1 ceph-mon[80126]: pgmap v574: 353 pgs: 353 active+clean; 458 KiB data, 153 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 85 B/s wr, 114 op/s
Jan 23 10:12:42 compute-1 sudo[226803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:12:42 compute-1 sudo[226803]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:12:42 compute-1 sudo[226803]: pam_unix(sudo:session): session closed for user root
Jan 23 10:12:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:12:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:42.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:12:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:42.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:42 compute-1 ceph-mon[80126]: pgmap v575: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 81 KiB/s rd, 938 B/s wr, 134 op/s
Jan 23 10:12:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:44.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:44.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:45 compute-1 ceph-mon[80126]: pgmap v576: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 81 KiB/s rd, 938 B/s wr, 134 op/s
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:12:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:46.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:46.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:47 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:47 compute-1 ceph-mon[80126]: pgmap v577: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 109 KiB/s rd, 1023 B/s wr, 182 op/s
Jan 23 10:12:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c001c00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:48.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:48.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101248 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:12:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:49 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:12:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:49 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:12:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:49 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c000fa0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:49 compute-1 ceph-mon[80126]: pgmap v578: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 938 B/s wr, 67 op/s
Jan 23 10:12:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3459287412' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:12:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3459287412' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:12:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940000d00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:12:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:50.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:12:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:12:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:50.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:12:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:51 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "version", "format": "json"} v 0)
Jan 23 10:12:51 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/3119665311' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:51 compute-1 podman[226849]: 2026-01-23 10:12:51.616472031 +0000 UTC m=+0.097224665 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:12:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:51 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:51 compute-1 ceph-mon[80126]: pgmap v579: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 40 KiB/s rd, 938 B/s wr, 67 op/s
Jan 23 10:12:51 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/272537903' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:51 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3119665311' entity='client.openstack' cmd=[{"prefix": "version", "format": "json"}]: dispatch
Jan 23 10:12:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:52.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:52 compute-1 ceph-mon[80126]: from='client.14940 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:52 compute-1 ceph-mon[80126]: from='client.24536 -' entity='client.openstack' cmd=[{"prefix": "fs volume ls", "format": "json"}]: dispatch
Jan 23 10:12:52 compute-1 ceph-mon[80126]: from='client.24536 -' entity='client.openstack' cmd=[{"prefix": "nfs cluster info", "cluster_id": "cephfs", "format": "json"}]: dispatch
Jan 23 10:12:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:52.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:53 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:53 compute-1 ceph-mon[80126]: pgmap v580: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 42 KiB/s rd, 1.7 KiB/s wr, 70 op/s
Jan 23 10:12:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:54.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101254 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:12:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:54.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:12:55.039 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:12:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:12:55.040 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:12:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:12:55.040 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:12:55 compute-1 ceph-mon[80126]: pgmap v581: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 852 B/s wr, 50 op/s
Jan 23 10:12:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:55 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:56.483 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:56.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:57 compute-1 ceph-mon[80126]: pgmap v582: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 852 B/s wr, 50 op/s
Jan 23 10:12:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:57 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c001ac0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:58 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940001820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:12:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:12:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:12:58.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.557 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.557 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.579 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.579 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.580 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.580 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:12:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:12:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:12:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:12:58.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:12:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:58 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001b20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.891 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.891 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.892 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.892 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.893 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.921 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.921 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.921 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.922 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:12:58 compute-1 nova_compute[225705]: 2026-01-23 10:12:58.922 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:12:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:12:59 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/190810241' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.403 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.554 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.555 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5256MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.555 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.556 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:12:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:12:59 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.658 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.659 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:12:59 compute-1 nova_compute[225705]: 2026-01-23 10:12:59.691 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:12:59 compute-1 ceph-mon[80126]: pgmap v583: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 767 B/s wr, 2 op/s
Jan 23 10:12:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/811842954' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:12:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/190810241' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:00 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:00 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:13:00 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/346920199' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:00 compute-1 nova_compute[225705]: 2026-01-23 10:13:00.169 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:13:00 compute-1 nova_compute[225705]: 2026-01-23 10:13:00.175 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:13:00 compute-1 nova_compute[225705]: 2026-01-23 10:13:00.192 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:13:00 compute-1 nova_compute[225705]: 2026-01-23 10:13:00.194 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:13:00 compute-1 nova_compute[225705]: 2026-01-23 10:13:00.195 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.640s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:13:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:00.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:00.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:00 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2632395150' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/346920199' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/507459087' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:01 compute-1 ceph-mon[80126]: pgmap v584: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 767 B/s wr, 2 op/s
Jan 23 10:13:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:01 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:02 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/814723742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:02 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:02 compute-1 sudo[226925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:13:02 compute-1 sudo[226925]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:02 compute-1 sudo[226925]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:02.492 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:02.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:02 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:03 compute-1 ceph-mon[80126]: pgmap v585: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.1 KiB/s rd, 767 B/s wr, 3 op/s
Jan 23 10:13:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:03 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:04 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934002f00 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:04.494 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:04 compute-1 podman[226951]: 2026-01-23 10:13:04.653445814 +0000 UTC m=+0.052600189 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:13:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:04.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:04 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:05 compute-1 ceph-mon[80126]: pgmap v586: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:05 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c002f50 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:06 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940002cb0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:06.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:06.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:06 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:07 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:07 compute-1 ceph-mon[80126]: pgmap v587: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:08 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:08.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:08.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:08 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:09 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:09 compute-1 ceph-mon[80126]: pgmap v588: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:10 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:10.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:10 compute-1 ceph-mon[80126]: pgmap v589: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:10.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:10 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:11 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:12 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:13:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:12.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:13:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:12.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:12 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:13 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:13 compute-1 ceph-mon[80126]: pgmap v590: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:13:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:14 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:14.505 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:14.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:14 compute-1 ceph-mon[80126]: pgmap v591: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:14 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003c10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:15 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:16 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:16.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:16.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:16 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:17 compute-1 ceph-mon[80126]: pgmap v592: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:17 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004050 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101317 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:13:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:18 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_7] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:18.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:18.858 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:18 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:19 compute-1 ceph-mon[80126]: pgmap v593: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:19 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:20 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:20.513 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:20.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:20 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:21 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:21 compute-1 ceph-mon[80126]: pgmap v594: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:22 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59380016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:22 compute-1 sudo[226982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:13:22 compute-1 sudo[226982]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:22 compute-1 sudo[226982]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:22 compute-1 podman[227006]: 2026-01-23 10:13:22.424178755 +0000 UTC m=+0.112913291 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 10:13:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:22.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:22.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:22 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958001110 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:23 compute-1 ceph-mon[80126]: pgmap v595: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:13:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:23 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:23 compute-1 sshd-session[227037]: Invalid user sol from 45.148.10.240 port 34916
Jan 23 10:13:23 compute-1 sshd-session[227037]: Connection closed by invalid user sol 45.148.10.240 port 34916 [preauth]
Jan 23 10:13:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:24 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:24.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:24.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:24 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59380016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:25 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59580022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:25 compute-1 ceph-mon[80126]: pgmap v596: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:13:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:26 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:26.520 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:26 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:26.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:27 compute-1 ceph-mon[80126]: pgmap v597: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:13:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:27 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:13:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:27 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59380016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:28 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59580022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:28.522 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:28 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:28.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:29 compute-1 ceph-mon[80126]: pgmap v598: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:13:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:29 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:30 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:30.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:30 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59580022a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:30.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:31 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:13:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:31 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:13:31 compute-1 ceph-mon[80126]: pgmap v599: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:13:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:31 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:32 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:32.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:32 compute-1 sudo[227043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:13:32 compute-1 sudo[227043]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:32 compute-1 sudo[227043]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:32 compute-1 sudo[227068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:13:32 compute-1 sudo[227068]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:32 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:32.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:33 compute-1 ceph-mon[80126]: pgmap v600: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 852 B/s wr, 3 op/s
Jan 23 10:13:33 compute-1 sudo[227068]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:34 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:34 : epoch 69734991 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:13:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:34.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:34 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:13:34 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:13:34 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:34 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938002b10 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:34.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:35 compute-1 podman[227126]: 2026-01-23 10:13:35.664390644 +0000 UTC m=+0.061120689 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Jan 23 10:13:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:35 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:13:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:13:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:13:35 compute-1 ceph-mon[80126]: pgmap v601: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 852 B/s wr, 2 op/s
Jan 23 10:13:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:36 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:36.531 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:36 compute-1 ceph-mon[80126]: pgmap v602: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:13:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:36 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:36.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:37 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:38 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:38.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:38 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:38.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:39 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:39 compute-1 sudo[227150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:13:39 compute-1 sudo[227150]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:39 compute-1 sudo[227150]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101339 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:13:40 compute-1 ceph-mon[80126]: pgmap v603: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:13:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:40.536 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:40.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:13:41 compute-1 ceph-mon[80126]: pgmap v604: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.7 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:13:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:41 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:42 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:42 compute-1 sudo[227176]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:13:42 compute-1 sudo[227176]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:13:42 compute-1 sudo[227176]: pam_unix(sudo:session): session closed for user root
Jan 23 10:13:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:42.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:42 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:42.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:43 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:44 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:44.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:44 compute-1 ceph-mon[80126]: pgmap v605: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 1.8 KiB/s rd, 938 B/s wr, 2 op/s
Jan 23 10:13:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:44 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_11] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:44.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:45 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:45 compute-1 ceph-mon[80126]: pgmap v606: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:13:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:46.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:46.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:47 compute-1 ceph-mon[80126]: pgmap v607: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 170 B/s wr, 0 op/s
Jan 23 10:13:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:47 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5938003820 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:48.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:48.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:13:49.241 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:13:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:13:49.244 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:13:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:13:49.246 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:13:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:49 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940003db0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:49 compute-1 ceph-mon[80126]: pgmap v608: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:13:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/239642082' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:13:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/239642082' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:13:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:50.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:13:50 compute-1 ceph-mon[80126]: pgmap v609: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:13:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:50.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:51 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:52.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:52 compute-1 podman[227208]: 2026-01-23 10:13:52.689188716 +0000 UTC m=+0.092182524 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Jan 23 10:13:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958003730 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:52.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:53 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:53 compute-1 ceph-mon[80126]: pgmap v610: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:13:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:54.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:54.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:13:55.040 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:13:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:13:55.041 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:13:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:13:55.041 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:13:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:55 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:56 compute-1 ceph-mon[80126]: pgmap v611: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:13:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:56.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:56.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:57 compute-1 ceph-mon[80126]: pgmap v612: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:13:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:57 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:13:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:58 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:13:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:13:58.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:13:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/774375893' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4074051618' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:13:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:58 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:13:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:13:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:13:58.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.178 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.178 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.178 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.178 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.198 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.198 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.198 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:13:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:13:59 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:13:59 compute-1 nova_compute[225705]: 2026-01-23 10:13:59.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:00 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c002ad0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:00.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:00 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.917 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.917 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.917 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.918 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:14:00 compute-1 nova_compute[225705]: 2026-01-23 10:14:00.918 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:14:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:00.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:01 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:14:01 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2203974394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:01 compute-1 nova_compute[225705]: 2026-01-23 10:14:01.395 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:14:01 compute-1 nova_compute[225705]: 2026-01-23 10:14:01.572 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:14:01 compute-1 nova_compute[225705]: 2026-01-23 10:14:01.573 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5253MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:14:01 compute-1 nova_compute[225705]: 2026-01-23 10:14:01.573 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:14:01 compute-1 nova_compute[225705]: 2026-01-23 10:14:01.574 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:14:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:01 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.130 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.131 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.149 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:14:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:02 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:02 compute-1 ceph-mon[80126]: pgmap v613: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:02 compute-1 sudo[227282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:14:02 compute-1 sudo[227282]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:02 compute-1 sudo[227282]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:02.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:14:02 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/515581582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.658 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.666 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.833 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.835 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:14:02 compute-1 nova_compute[225705]: 2026-01-23 10:14:02.835 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:14:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:02 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c003570 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:02.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:03 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:03 compute-1 ceph-mon[80126]: pgmap v614: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2203974394' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2596105723' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:03 compute-1 ceph-mon[80126]: pgmap v615: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/515581582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3988597975' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:14:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:04 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001090 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:04.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:04 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:04.948 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:05 compute-1 ceph-mon[80126]: pgmap v616: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:05 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c003570 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:06 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:06.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:06 compute-1 podman[227311]: 2026-01-23 10:14:06.680813245 +0000 UTC m=+0.080579396 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 10:14:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:06 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:06.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:07 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:08 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c003570 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:08 compute-1 ceph-mon[80126]: pgmap v617: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:08.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:08 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:08.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:09 compute-1 ceph-mon[80126]: pgmap v618: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:09 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:10 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:10.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:10 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c003570 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:10.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:11 compute-1 ceph-mon[80126]: pgmap v619: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:11 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:12 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:12.570 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:12 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:12.962 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:13 compute-1 ceph-mon[80126]: pgmap v620: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:13 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c003570 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:14 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_15] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:14.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:14 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:14:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:14.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:14:15 compute-1 ceph-mon[80126]: pgmap v621: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:15 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:16 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:16.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:16 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964001d70 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:16.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:17 compute-1 ceph-mon[80126]: pgmap v622: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:17 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:18 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:18.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:18 compute-1 ceph-mon[80126]: pgmap v623: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:18 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:18.973 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:19 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:20 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:20.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:20 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:20.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:21 compute-1 ceph-mon[80126]: pgmap v624: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:21 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:22 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:22 compute-1 sudo[227340]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:14:22 compute-1 sudo[227340]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:22 compute-1 sudo[227340]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:22.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:22 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:22.979 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:23 compute-1 podman[227366]: 2026-01-23 10:14:23.697678169 +0000 UTC m=+0.099657062 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:14:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:23 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:24 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:24 compute-1 ceph-mon[80126]: pgmap v625: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:24.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:24 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:24.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:25 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:26 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:14:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:26.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:14:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:26 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:26.986 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:27 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:28 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:28.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:28 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934003a20 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:28.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:29 compute-1 ceph-mon[80126]: pgmap v626: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:29 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964000df0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:30 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:30.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:30 compute-1 ceph-mon[80126]: pgmap v627: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:30 compute-1 ceph-mon[80126]: pgmap v628: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:30 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:30.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:31 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:32 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:32.594 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:32 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:32.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:33 compute-1 ceph-mon[80126]: pgmap v629: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:34 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:34 compute-1 ceph-mon[80126]: pgmap v630: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:34.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:34 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:35.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:35 compute-1 ceph-mon[80126]: pgmap v631: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:35 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:36 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:36.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:36 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5964009990 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:37.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:37 compute-1 ceph-mon[80126]: pgmap v632: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:37 compute-1 podman[227401]: 2026-01-23 10:14:37.648652768 +0000 UTC m=+0.052025924 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Jan 23 10:14:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:37 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:38 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:38.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:38 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:39.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:39 compute-1 ceph-mon[80126]: pgmap v633: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:39 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:39 compute-1 sudo[227421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:14:39 compute-1 sudo[227421]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:39 compute-1 sudo[227421]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:40 compute-1 sudo[227446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:14:40 compute-1 sudo[227446]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:40 compute-1 sudo[227446]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:40.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:14:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:14:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:41.010 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:41 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:41 compute-1 ceph-mon[80126]: pgmap v634: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:42 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:42 compute-1 sudo[227503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:14:42 compute-1 sudo[227503]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:42 compute-1 sudo[227503]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:42.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:42 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:43.013 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:43 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:43 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:43 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:43 compute-1 ceph-mon[80126]: pgmap v635: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:43 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:14:43 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:14:43 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:14:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:44 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:44.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:44 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:44 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:44 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:14:44 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:14:44 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:14:44 compute-1 ceph-mon[80126]: pgmap v636: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:44 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:45.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:45 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c002910 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:46.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:47.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.219145) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287219198, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2372, "num_deletes": 251, "total_data_size": 6404427, "memory_usage": 6495224, "flush_reason": "Manual Compaction"}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Jan 23 10:14:47 compute-1 ceph-mon[80126]: pgmap v637: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287263057, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 4161547, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 20566, "largest_seqno": 22933, "table_properties": {"data_size": 4151860, "index_size": 6117, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19752, "raw_average_key_size": 20, "raw_value_size": 4132530, "raw_average_value_size": 4225, "num_data_blocks": 268, "num_entries": 978, "num_filter_entries": 978, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163067, "oldest_key_time": 1769163067, "file_creation_time": 1769163287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 43995 microseconds, and 10535 cpu microseconds.
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.263132) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 4161547 bytes OK
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.263165) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.265599) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.265671) EVENT_LOG_v1 {"time_micros": 1769163287265660, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.265695) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 6393860, prev total WAL file size 6393860, number of live WAL files 2.
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.267297) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031353036' seq:72057594037927935, type:22 .. '7061786F730031373538' seq:0, type:0; will stop at (end)
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(4064KB)], [39(12MB)]
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287267345, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 17420640, "oldest_snapshot_seqno": -1}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 5501 keys, 15177368 bytes, temperature: kUnknown
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287547656, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 15177368, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15137702, "index_size": 24836, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13765, "raw_key_size": 138633, "raw_average_key_size": 25, "raw_value_size": 15035059, "raw_average_value_size": 2733, "num_data_blocks": 1026, "num_entries": 5501, "num_filter_entries": 5501, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.547935) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 15177368 bytes
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.550562) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 62.1 rd, 54.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.6 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(7.8) write-amplify(3.6) OK, records in: 6019, records dropped: 518 output_compression: NoCompression
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.550587) EVENT_LOG_v1 {"time_micros": 1769163287550576, "job": 22, "event": "compaction_finished", "compaction_time_micros": 280399, "compaction_time_cpu_micros": 31308, "output_level": 6, "num_output_files": 1, "total_output_size": 15177368, "num_input_records": 6019, "num_output_records": 5501, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287551573, "job": 22, "event": "table_file_deletion", "file_number": 41}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163287553787, "job": 22, "event": "table_file_deletion", "file_number": 39}
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.267134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.553977) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.553985) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.553988) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.553989) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:14:47.553991) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:14:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:47 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:48.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:49.022 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:49 compute-1 sudo[227534]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:14:49 compute-1 sudo[227534]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:14:49 compute-1 sudo[227534]: pam_unix(sudo:session): session closed for user root
Jan 23 10:14:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:49 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:49 compute-1 ceph-mon[80126]: pgmap v638: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3340243466' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:14:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3340243466' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:14:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:14:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:50.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:14:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:51.025 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:51 compute-1 ceph-mon[80126]: pgmap v639: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:51 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:52.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:53.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:53 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:53 compute-1 ceph-mon[80126]: pgmap v640: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:14:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:54.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:54 compute-1 podman[227561]: 2026-01-23 10:14:54.689664509 +0000 UTC m=+0.100755104 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:14:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:55 compute-1 ceph-mon[80126]: pgmap v641: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:14:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:55.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:14:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:14:55.042 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:14:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:14:55.043 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:14:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:14:55.043 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:14:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:55 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934004340 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:56.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:57.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:57 compute-1 ceph-mon[80126]: pgmap v642: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:14:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:57 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:58 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:14:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:14:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:14:58.629 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:14:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:58 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:14:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:14:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:14:59.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:14:59 compute-1 ceph-mon[80126]: pgmap v643: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:14:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:14:59 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:14:59 compute-1 nova_compute[225705]: 2026-01-23 10:14:59.830 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:14:59 compute-1 nova_compute[225705]: 2026-01-23 10:14:59.831 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.031 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.032 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.032 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.137 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.137 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.137 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.138 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.138 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:15:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:00 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/315230113' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:00.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 nova_compute[225705]: 2026-01-23 10:15:00.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:00 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940001230 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:01.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.181393) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301181434, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 398, "num_deletes": 250, "total_data_size": 509751, "memory_usage": 518304, "flush_reason": "Manual Compaction"}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301185982, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 319310, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22938, "largest_seqno": 23331, "table_properties": {"data_size": 316996, "index_size": 478, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 6033, "raw_average_key_size": 19, "raw_value_size": 312329, "raw_average_value_size": 1017, "num_data_blocks": 20, "num_entries": 307, "num_filter_entries": 307, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163288, "oldest_key_time": 1769163288, "file_creation_time": 1769163301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 4630 microseconds, and 1679 cpu microseconds.
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.186022) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 319310 bytes OK
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.186041) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.190080) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.190118) EVENT_LOG_v1 {"time_micros": 1769163301190107, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.190145) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 507171, prev total WAL file size 507171, number of live WAL files 2.
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.190788) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D67727374617400353033' seq:72057594037927935, type:22 .. '6D67727374617400373534' seq:0, type:0; will stop at (end)
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(311KB)], [42(14MB)]
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301190832, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 15496678, "oldest_snapshot_seqno": -1}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 5298 keys, 11393880 bytes, temperature: kUnknown
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301331858, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 11393880, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11360053, "index_size": 19509, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13253, "raw_key_size": 134832, "raw_average_key_size": 25, "raw_value_size": 11265329, "raw_average_value_size": 2126, "num_data_blocks": 794, "num_entries": 5298, "num_filter_entries": 5298, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.332242) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 11393880 bytes
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.446389) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 109.8 rd, 80.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 14.5 +0.0 blob) out(10.9 +0.0 blob), read-write-amplify(84.2) write-amplify(35.7) OK, records in: 5808, records dropped: 510 output_compression: NoCompression
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.446434) EVENT_LOG_v1 {"time_micros": 1769163301446419, "job": 24, "event": "compaction_finished", "compaction_time_micros": 141142, "compaction_time_cpu_micros": 35645, "output_level": 6, "num_output_files": 1, "total_output_size": 11393880, "num_input_records": 5808, "num_output_records": 5298, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301446743, "job": 24, "event": "table_file_deletion", "file_number": 44}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163301449545, "job": 24, "event": "table_file_deletion", "file_number": 42}
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.190696) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.449618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.449624) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.449626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.449628) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:15:01.449630) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:15:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:01 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940002140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:01 compute-1 ceph-mon[80126]: pgmap v644: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2458334425' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:01 compute-1 nova_compute[225705]: 2026-01-23 10:15:01.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.230 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.230 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.231 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.231 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.231 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:15:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:02 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:02.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:15:02 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/817666258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.702 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:15:02 compute-1 sudo[227613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:15:02 compute-1 sudo[227613]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:02 compute-1 sudo[227613]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:02 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/817666258' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.886 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.887 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5262MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.887 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:15:02 compute-1 nova_compute[225705]: 2026-01-23 10:15:02.888 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:15:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:02 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c0014d0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.025 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.026 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:15:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:03.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.048 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:15:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:15:03 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3691473830' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.507 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.511 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.547 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.549 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:15:03 compute-1 nova_compute[225705]: 2026-01-23 10:15:03.549 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.662s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:15:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:03 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:03 compute-1 ceph-mon[80126]: pgmap v645: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4283179534' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3691473830' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3138463557' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:04 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940002140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:04.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:04 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:04 compute-1 ceph-mon[80126]: pgmap v646: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:05.050 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:05 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:06 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:06.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:06 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5940002140 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:07.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:07 compute-1 ceph-mon[80126]: pgmap v647: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:07 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:08 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:08.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:08 compute-1 podman[227665]: 2026-01-23 10:15:08.651593379 +0000 UTC m=+0.056048442 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Jan 23 10:15:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:08 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:09.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:09 compute-1 ceph-mon[80126]: pgmap v648: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:09 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59400039c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:10 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:10.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:10 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:11.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:11 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:11 compute-1 ceph-mon[80126]: pgmap v649: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:12 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59400039c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:12.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:12 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:13.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:13 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:14 compute-1 ceph-mon[80126]: pgmap v650: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:14 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:14.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:14 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59400039c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:15.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:15 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:15 compute-1 ceph-mon[80126]: pgmap v651: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:16 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:16.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:16 compute-1 ceph-mon[80126]: pgmap v652: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:16 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:17.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:17 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59400039c0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:18 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:18.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:18 compute-1 ceph-mon[80126]: pgmap v653: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:18 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:19.074 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:19 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:20 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:20.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:20 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:21.077 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:21 compute-1 ceph-mon[80126]: pgmap v654: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:21 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:22 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:22.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:22 compute-1 sudo[227693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:15:22 compute-1 sudo[227693]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:22 compute-1 sudo[227693]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:22 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c001080 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:23.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:23 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:23 compute-1 ceph-mon[80126]: pgmap v655: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:24 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:24.657 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:24 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:25.085 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:25 compute-1 podman[227720]: 2026-01-23 10:15:25.697158604 +0000 UTC m=+0.099980681 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Jan 23 10:15:25 compute-1 ceph-mon[80126]: pgmap v656: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:25 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c0028a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:26 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:26.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:26 compute-1 ceph-mon[80126]: pgmap v657: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:26 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:27.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:27 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:28 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c0028a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:28.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:28 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:29.092 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:29 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:29 compute-1 ceph-mon[80126]: pgmap v658: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:30 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:30.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:30 compute-1 ceph-mon[80126]: pgmap v659: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:30 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c0028a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:31.095 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:31 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:32 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:32.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:32 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:33.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:33 compute-1 ceph-mon[80126]: pgmap v660: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:15:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:33 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 10:15:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - - [23/Jan/2026:10:15:34.077 +0000] "GET /swift/info HTTP/1.1" 200 539 - "python-urllib3/1.26.5" - latency=0.002000064s
Jan 23 10:15:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:34 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:34.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:34 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:34 compute-1 ceph-mon[80126]: pgmap v661: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:35.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:35 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004440 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:36 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:36.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:36 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:37 compute-1 ceph-mon[80126]: pgmap v662: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:15:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:37.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:37 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e139 e139: 3 total, 3 up, 3 in
Jan 23 10:15:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:38 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e139 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:38.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:38 compute-1 sshd-session[227754]: Invalid user sol from 45.148.10.240 port 56700
Jan 23 10:15:38 compute-1 podman[227756]: 2026-01-23 10:15:38.826820295 +0000 UTC m=+0.049190436 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:15:38 compute-1 sshd-session[227754]: Connection closed by invalid user sol 45.148.10.240 port 56700 [preauth]
Jan 23 10:15:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:38 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c003d30 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:39.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:39 compute-1 ceph-mon[80126]: osdmap e139: 3 total, 3 up, 3 in
Jan 23 10:15:39 compute-1 ceph-mon[80126]: pgmap v664: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 307 B/s rd, 0 op/s
Jan 23 10:15:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e140 e140: 3 total, 3 up, 3 in
Jan 23 10:15:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:39 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:40.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:40 compute-1 ceph-mon[80126]: osdmap e140: 3 total, 3 up, 3 in
Jan 23 10:15:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e141 e141: 3 total, 3 up, 3 in
Jan 23 10:15:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:40 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:41.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:41 compute-1 ceph-mon[80126]: pgmap v666: 353 pgs: 353 active+clean; 458 KiB data, 157 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:41 compute-1 ceph-mon[80126]: osdmap e141: 3 total, 3 up, 3 in
Jan 23 10:15:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:41 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f595c004a40 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:42 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_21] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:42.676 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e142 e142: 3 total, 3 up, 3 in
Jan 23 10:15:42 compute-1 sudo[227777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:15:42 compute-1 sudo[227777]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:42 compute-1 sudo[227777]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:42 compute-1 ceph-mon[80126]: pgmap v668: 353 pgs: 353 active+clean; 21 MiB data, 178 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 3.4 MiB/s wr, 31 op/s
Jan 23 10:15:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:42 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f593c004530 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:43.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e142 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:45 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59580044e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:45 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004a10 fd 49 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:45 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f596400a2b0 fd 50 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:15:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:45.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:45 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:45.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:45 compute-1 ceph-mon[80126]: osdmap e142: 3 total, 3 up, 3 in
Jan 23 10:15:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:45 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59580044e0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:46 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 e143: 3 total, 3 up, 3 in
Jan 23 10:15:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f592c000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:46 compute-1 ceph-mon[80126]: pgmap v670: 353 pgs: 353 active+clean; 21 MiB data, 178 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 3.4 MiB/s wr, 31 op/s
Jan 23 10:15:46 compute-1 ceph-mon[80126]: osdmap e143: 3 total, 3 up, 3 in
Jan 23 10:15:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:46 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5930000b60 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:15:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:47.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:47 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:47.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:47 compute-1 ceph-mon[80126]: pgmap v672: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 44 KiB/s rd, 6.8 MiB/s wr, 64 op/s
Jan 23 10:15:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:47 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004500 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:48 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f592c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3735003259' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:15:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3735003259' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:15:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:49.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:49.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:49 compute-1 sudo[227812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:15:49 compute-1 sudo[227812]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:49 compute-1 sudo[227812]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:49 compute-1 sudo[227837]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:15:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:49 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59300016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:49 compute-1 sudo[227837]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:50 compute-1 ceph-mon[80126]: pgmap v673: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 5.3 MiB/s wr, 49 op/s
Jan 23 10:15:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:50 compute-1 podman[227934]: 2026-01-23 10:15:50.385549532 +0000 UTC m=+0.070349604 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:15:50 compute-1 podman[227934]: 2026-01-23 10:15:50.514217548 +0000 UTC m=+0.199017590 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.build-date=20250325)
Jan 23 10:15:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:50 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004520 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:50 compute-1 podman[228053]: 2026-01-23 10:15:50.9949681 +0000 UTC m=+0.068114354 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:15:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:15:51 compute-1 ceph-mon[80126]: pgmap v674: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 2.6 MiB/s wr, 24 op/s
Jan 23 10:15:51 compute-1 podman[228053]: 2026-01-23 10:15:51.037807734 +0000 UTC m=+0.110953968 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:15:51 compute-1 podman[228145]: 2026-01-23 10:15:51.368780053 +0000 UTC m=+0.061855325 container exec 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:15:51 compute-1 podman[228145]: 2026-01-23 10:15:51.378893993 +0000 UTC m=+0.071969235 container exec_died 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True)
Jan 23 10:15:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:51.544 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:51.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:51 compute-1 podman[228208]: 2026-01-23 10:15:51.796417658 +0000 UTC m=+0.256649802 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:15:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:51 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f592c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:52 compute-1 podman[228208]: 2026-01-23 10:15:52.019906909 +0000 UTC m=+0.480139023 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:15:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59300016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:52 compute-1 podman[228274]: 2026-01-23 10:15:52.756138885 +0000 UTC m=+0.184700137 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, version=2.2.4, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.buildah.version=1.28.2, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=keepalived-container)
Jan 23 10:15:52 compute-1 podman[228295]: 2026-01-23 10:15:52.824809236 +0000 UTC m=+0.050079285 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, io.k8s.display-name=Keepalived on RHEL 9, io.openshift.tags=Ceph keepalived, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, distribution-scope=public, io.buildah.version=1.28.2, architecture=x86_64, release=1793, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, build-date=2023-02-22T09:23:20, io.openshift.expose-services=, version=2.2.4, name=keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=keepalived for Ceph)
Jan 23 10:15:52 compute-1 podman[228274]: 2026-01-23 10:15:52.831933451 +0000 UTC m=+0.260494713 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, io.buildah.version=1.28.2, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, description=keepalived for Ceph, version=2.2.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=keepalived, distribution-scope=public, io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.k8s.display-name=Keepalived on RHEL 9, vcs-type=git, release=1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., com.redhat.component=keepalived-container, io.openshift.expose-services=)
Jan 23 10:15:52 compute-1 sudo[227837]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:52 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:52 compute-1 sudo[228309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:15:52 compute-1 sudo[228309]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:52 compute-1 sudo[228309]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:53 compute-1 sudo[228334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:15:53 compute-1 sudo[228334]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:53.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:53.547 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:53 compute-1 sudo[228334]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:53 compute-1 ceph-mon[80126]: pgmap v675: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 2.1 MiB/s wr, 20 op/s
Jan 23 10:15:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:53 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004540 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f592c0016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:54 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f59300016a0 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:15:55.043 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:15:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:15:55.043 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:15:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:15:55.044 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:15:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:15:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:15:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:15:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:15:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:15:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:15:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:55.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:15:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:55.550 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:55 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5934001680 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:56 compute-1 ceph-mon[80126]: pgmap v676: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Jan 23 10:15:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f5958004560 fd 39 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:15:56 compute-1 podman[228392]: 2026-01-23 10:15:56.672727354 +0000 UTC m=+0.078307136 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 10:15:56 compute-1 nova_compute[225705]: 2026-01-23 10:15:56.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:56 compute-1 nova_compute[225705]: 2026-01-23 10:15:56.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:15:56 compute-1 nova_compute[225705]: 2026-01-23 10:15:56.895 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:15:56 compute-1 nova_compute[225705]: 2026-01-23 10:15:56.896 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:56 compute-1 nova_compute[225705]: 2026-01-23 10:15:56.896 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:15:56 compute-1 nova_compute[225705]: 2026-01-23 10:15:56.908 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:56 compute-1 kernel: ganesha.nfsd[227809]: segfault at 50 ip 00007f59e767b32e sp 00007f5948ff8210 error 4 in libntirpc.so.5.8[7f59e7660000+2c000] likely on CPU 0 (core 0, socket 0)
Jan 23 10:15:56 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:15:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[226736]: 23/01/2026 10:15:56 : epoch 69734991 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7f592c002b10 fd 39 proxy ignored for local
Jan 23 10:15:57 compute-1 systemd[1]: Started Process Core Dump (PID 228416/UID 0).
Jan 23 10:15:57 compute-1 ceph-mon[80126]: pgmap v677: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 18 op/s
Jan 23 10:15:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:57.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:57.553 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:58 compute-1 systemd-coredump[228417]: Process 226742 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 71:
                                                    #0  0x00007f59e767b32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    #1  0x0000000000000000 n/a (n/a + 0x0)
                                                    #2  0x00007f59e7685900 n/a (/usr/lib64/libntirpc.so.5.8 + 0x2c900)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:15:58 compute-1 systemd[1]: systemd-coredump@9-228416-0.service: Deactivated successfully.
Jan 23 10:15:58 compute-1 systemd[1]: systemd-coredump@9-228416-0.service: Consumed 1.141s CPU time.
Jan 23 10:15:58 compute-1 podman[228423]: 2026-01-23 10:15:58.442948764 +0000 UTC m=+0.025878599 container died 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1)
Jan 23 10:15:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:15:58 compute-1 systemd[1]: var-lib-containers-storage-overlay-10803eb9a75f48fb7cb186471a2a3805dc4398d47b009ba77e7fc291e4b25cc7-merged.mount: Deactivated successfully.
Jan 23 10:15:58 compute-1 podman[228423]: 2026-01-23 10:15:58.645418892 +0000 UTC m=+0.228348627 container remove 3d588b2d6a61f0c82320d69be6e2bc16ea6df8fb16a33be4f7b20e31fd8a1af6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2)
Jan 23 10:15:58 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:15:58 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:15:58 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.730s CPU time.
Jan 23 10:15:58 compute-1 nova_compute[225705]: 2026-01-23 10:15:58.915 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:58 compute-1 nova_compute[225705]: 2026-01-23 10:15:58.916 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:15:58 compute-1 nova_compute[225705]: 2026-01-23 10:15:58.916 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:15:58 compute-1 nova_compute[225705]: 2026-01-23 10:15:58.916 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:15:58 compute-1 nova_compute[225705]: 2026-01-23 10:15:58.930 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:15:59 compute-1 sudo[228470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:15:59 compute-1 sudo[228470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:15:59 compute-1 sudo[228470]: pam_unix(sudo:session): session closed for user root
Jan 23 10:15:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:15:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:15:59.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:15:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:15:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:15:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:15:59.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:15:59 compute-1 ceph-mon[80126]: pgmap v678: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:15:59 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:59 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:15:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3650861742' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:15:59 compute-1 nova_compute[225705]: 2026-01-23 10:15:59.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3003629658' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:00 compute-1 nova_compute[225705]: 2026-01-23 10:16:00.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:00 compute-1 nova_compute[225705]: 2026-01-23 10:16:00.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:00 compute-1 nova_compute[225705]: 2026-01-23 10:16:00.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:16:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:16:00.988 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:16:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:16:00.990 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:16:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:01.558 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:01.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:01 compute-1 nova_compute[225705]: 2026-01-23 10:16:01.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:01 compute-1 nova_compute[225705]: 2026-01-23 10:16:01.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:01 compute-1 ceph-mon[80126]: pgmap v679: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:16:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/472721166' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:01 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:16:01.993 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:16:02 compute-1 nova_compute[225705]: 2026-01-23 10:16:02.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:02 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4013248704' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:02 compute-1 ceph-mon[80126]: pgmap v680: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:16:02 compute-1 sudo[228497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:16:02 compute-1 sudo[228497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:02 compute-1 sudo[228497]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101602 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:16:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:03.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:03.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:03 compute-1 nova_compute[225705]: 2026-01-23 10:16:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:16:03 compute-1 nova_compute[225705]: 2026-01-23 10:16:03.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:16:03 compute-1 nova_compute[225705]: 2026-01-23 10:16:03.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:16:03 compute-1 nova_compute[225705]: 2026-01-23 10:16:03.898 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:16:03 compute-1 nova_compute[225705]: 2026-01-23 10:16:03.898 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:16:03 compute-1 nova_compute[225705]: 2026-01-23 10:16:03.898 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:16:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:16:04 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/455437505' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.336 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.499 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.500 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5265MB free_disk=59.98828125GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.501 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.501 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:16:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/455437505' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.620 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.621 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.699 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.766 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.766 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.789 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.811 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:16:04 compute-1 nova_compute[225705]: 2026-01-23 10:16:04.827 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:16:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:16:05 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3264499221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:05 compute-1 nova_compute[225705]: 2026-01-23 10:16:05.307 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:16:05 compute-1 nova_compute[225705]: 2026-01-23 10:16:05.312 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:16:05 compute-1 nova_compute[225705]: 2026-01-23 10:16:05.347 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:16:05 compute-1 nova_compute[225705]: 2026-01-23 10:16:05.348 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:16:05 compute-1 nova_compute[225705]: 2026-01-23 10:16:05.348 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.847s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:16:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:05.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:05 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:05.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:05 compute-1 ceph-mon[80126]: pgmap v681: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:16:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3264499221' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:07.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:07.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:07 compute-1 ceph-mon[80126]: pgmap v682: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:16:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:09 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 10.
Jan 23 10:16:09 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:16:09 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.730s CPU time.
Jan 23 10:16:09 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:16:09 compute-1 podman[228570]: 2026-01-23 10:16:09.189385362 +0000 UTC m=+0.081950921 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:16:09 compute-1 podman[228631]: 2026-01-23 10:16:09.310748048 +0000 UTC m=+0.046965165 container create 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:16:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bed27c64872bf8529345969a4191e92df6ccf355f83eb481cb6d4f93143e1094/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bed27c64872bf8529345969a4191e92df6ccf355f83eb481cb6d4f93143e1094/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bed27c64872bf8529345969a4191e92df6ccf355f83eb481cb6d4f93143e1094/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:09 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bed27c64872bf8529345969a4191e92df6ccf355f83eb481cb6d4f93143e1094/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:16:09 compute-1 podman[228631]: 2026-01-23 10:16:09.36494875 +0000 UTC m=+0.101165887 container init 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, CEPH_REF=squid, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:16:09 compute-1 podman[228631]: 2026-01-23 10:16:09.374319196 +0000 UTC m=+0.110536313 container start 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS)
Jan 23 10:16:09 compute-1 bash[228631]: 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26
Jan 23 10:16:09 compute-1 podman[228631]: 2026-01-23 10:16:09.29089778 +0000 UTC m=+0.027114917 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:16:09 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:16:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:09.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:09.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:16:09 compute-1 ceph-mon[80126]: pgmap v683: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:16:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:11.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:11.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:11 compute-1 ceph-mon[80126]: pgmap v684: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:16:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101612 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:16:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:13.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:13.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:13 compute-1 ceph-mon[80126]: pgmap v685: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:16:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:15 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:15.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:15.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:16:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:16:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:16:15 compute-1 ceph-mon[80126]: pgmap v686: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:16:16 compute-1 ceph-mon[80126]: pgmap v687: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:16:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:17 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:17.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:17.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:16:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:16:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:16:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:19.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:19 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:19.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:19 compute-1 ceph-mon[80126]: pgmap v688: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:16:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/420910609' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:16:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101620 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:16:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [ALERT] 022/101620 (4) : backend 'backend' has no server available!
Jan 23 10:16:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:21 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:21.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:21.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e144 e144: 3 total, 3 up, 3 in
Jan 23 10:16:21 compute-1 ceph-mon[80126]: pgmap v689: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:16:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e145 e145: 3 total, 3 up, 3 in
Jan 23 10:16:23 compute-1 sudo[228695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:16:23 compute-1 sudo[228695]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:23 compute-1 sudo[228695]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:23 compute-1 ceph-mon[80126]: osdmap e144: 3 total, 3 up, 3 in
Jan 23 10:16:23 compute-1 ceph-mon[80126]: pgmap v691: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 716 B/s wr, 10 op/s
Jan 23 10:16:23 compute-1 ceph-mon[80126]: osdmap e145: 3 total, 3 up, 3 in
Jan 23 10:16:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:23.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:23 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:23.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1399276032' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:16:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:25.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:25 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:25.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf10000df0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:25 compute-1 ceph-mon[80126]: pgmap v693: 353 pgs: 353 active+clean; 41 MiB data, 198 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 383 B/s wr, 11 op/s
Jan 23 10:16:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1942849923' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:16:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:16:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040016c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:27 compute-1 ceph-mon[80126]: pgmap v694: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 54 op/s
Jan 23 10:16:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:27.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:27.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:27 compute-1 podman[228738]: 2026-01-23 10:16:27.686646985 +0000 UTC m=+0.091562404 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, 
container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:16:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ac0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101628 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 1 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:16:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:16:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:16:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:29.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:29.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:29 compute-1 ceph-mon[80126]: pgmap v695: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 54 op/s
Jan 23 10:16:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf140025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:31 compute-1 ceph-mon[80126]: pgmap v696: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 2.4 MiB/s wr, 39 op/s
Jan 23 10:16:31 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 e146: 3 total, 3 up, 3 in
Jan 23 10:16:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:31.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:31.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:16:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:32 compute-1 ceph-mon[80126]: osdmap e146: 3 total, 3 up, 3 in
Jan 23 10:16:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:16:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf140025c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:33.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:33.620 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:34 compute-1 ceph-mon[80126]: pgmap v698: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.4 MiB/s rd, 2.2 MiB/s wr, 128 op/s
Jan 23 10:16:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101635 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 2 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:16:35 compute-1 ceph-mon[80126]: pgmap v699: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 124 op/s
Jan 23 10:16:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:16:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:16:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:35.614 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:35.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf140032d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.215191) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396215281, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1295, "num_deletes": 260, "total_data_size": 2949529, "memory_usage": 2987360, "flush_reason": "Manual Compaction"}
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396227966, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 1940856, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 23336, "largest_seqno": 24626, "table_properties": {"data_size": 1935237, "index_size": 2950, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12062, "raw_average_key_size": 19, "raw_value_size": 1923624, "raw_average_value_size": 3077, "num_data_blocks": 130, "num_entries": 625, "num_filter_entries": 625, "num_deletions": 260, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163301, "oldest_key_time": 1769163301, "file_creation_time": 1769163396, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 12835 microseconds, and 5124 cpu microseconds.
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.228040) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 1940856 bytes OK
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.228075) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.229870) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.229895) EVENT_LOG_v1 {"time_micros": 1769163396229888, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.229921) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 2943288, prev total WAL file size 2943288, number of live WAL files 2.
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.231168) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00323535' seq:72057594037927935, type:22 .. '6C6F676D00353131' seq:0, type:0; will stop at (end)
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(1895KB)], [45(10MB)]
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396231249, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 13334736, "oldest_snapshot_seqno": -1}
Jan 23 10:16:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 5383 keys, 13143103 bytes, temperature: kUnknown
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396334057, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 13143103, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13106721, "index_size": 21808, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13509, "raw_key_size": 137833, "raw_average_key_size": 25, "raw_value_size": 13008535, "raw_average_value_size": 2416, "num_data_blocks": 888, "num_entries": 5383, "num_filter_entries": 5383, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163396, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.334394) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 13143103 bytes
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.336163) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 129.6 rd, 127.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 10.9 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(13.6) write-amplify(6.8) OK, records in: 5923, records dropped: 540 output_compression: NoCompression
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.336185) EVENT_LOG_v1 {"time_micros": 1769163396336175, "job": 26, "event": "compaction_finished", "compaction_time_micros": 102916, "compaction_time_cpu_micros": 29271, "output_level": 6, "num_output_files": 1, "total_output_size": 13143103, "num_input_records": 5923, "num_output_records": 5383, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396336939, "job": 26, "event": "table_file_deletion", "file_number": 47}
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163396340602, "job": 26, "event": "table_file_deletion", "file_number": 45}
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.231030) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.340746) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.340754) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.340757) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.340759) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:36 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:16:36.340761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:16:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:37 compute-1 ceph-mon[80126]: pgmap v700: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 92 op/s
Jan 23 10:16:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:37.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:37.626 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001c60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf140032d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:16:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:39.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:39.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:39 compute-1 podman[228770]: 2026-01-23 10:16:39.648342998 +0000 UTC m=+0.052454919 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 10:16:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:40 compute-1 ceph-mon[80126]: pgmap v701: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 92 op/s
Jan 23 10:16:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80030f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:41 compute-1 ceph-mon[80126]: pgmap v702: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 16 KiB/s wr, 92 op/s
Jan 23 10:16:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80030f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:41.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:41.633 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101642 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:16:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:43 compute-1 sudo[228793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:16:43 compute-1 sudo[228793]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:43 compute-1 sudo[228793]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:43.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:43.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:43 compute-1 ceph-mon[80126]: pgmap v703: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 548 KiB/s rd, 1.2 KiB/s wr, 30 op/s
Jan 23 10:16:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80030f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:45.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:45.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:45 compute-1 ceph-mon[80126]: pgmap v704: 353 pgs: 353 active+clean; 88 MiB data, 220 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 767 B/s wr, 9 op/s
Jan 23 10:16:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:46 compute-1 ceph-mon[80126]: pgmap v705: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Jan 23 10:16:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:47.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:47.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:49 compute-1 ceph-mon[80126]: pgmap v706: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 389 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:16:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:49.643 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:49.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:50 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3378120931' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:16:50 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3378120931' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:16:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:16:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:51.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:51 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:51.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:52 compute-1 ceph-mon[80126]: pgmap v707: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 389 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:16:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:53 compute-1 ceph-mon[80126]: pgmap v708: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 390 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Jan 23 10:16:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:53 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:53.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:53.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:16:55.044 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:16:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:16:55.044 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:16:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:16:55.044 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:16:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:55 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:55.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:55.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:55 compute-1 ceph-mon[80126]: pgmap v709: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 363 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:16:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:16:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:57.655 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:16:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:16:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:57 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:57.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:58 compute-1 ceph-mon[80126]: pgmap v710: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 368 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:16:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:16:58 compute-1 podman[228828]: 2026-01-23 10:16:58.668910205 +0000 UTC m=+0.074959159 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, 
io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:16:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:16:59 compute-1 sudo[228855]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:16:59 compute-1 sudo[228855]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:59 compute-1 sudo[228855]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:59 compute-1 sudo[228880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:16:59 compute-1 sudo[228880]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:16:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:16:59.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:16:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:16:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:16:59.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:16:59 compute-1 sudo[228880]: pam_unix(sudo:session): session closed for user root
Jan 23 10:16:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:16:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:00 compute-1 nova_compute[225705]: 2026-01-23 10:17:00.343 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:00 compute-1 nova_compute[225705]: 2026-01-23 10:17:00.344 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:00 compute-1 nova_compute[225705]: 2026-01-23 10:17:00.344 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:17:00 compute-1 nova_compute[225705]: 2026-01-23 10:17:00.344 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:17:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04001fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:00 compute-1 nova_compute[225705]: 2026-01-23 10:17:00.369 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:17:00 compute-1 nova_compute[225705]: 2026-01-23 10:17:00.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:01 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:17:01.134 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:17:01 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:17:01.136 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:17:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:01.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:01.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:01 compute-1 nova_compute[225705]: 2026-01-23 10:17:01.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:01 compute-1 nova_compute[225705]: 2026-01-23 10:17:01.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:01 compute-1 nova_compute[225705]: 2026-01-23 10:17:01.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:17:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:01 compute-1 ceph-mon[80126]: pgmap v711: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:17:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1666916957' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:02 compute-1 nova_compute[225705]: 2026-01-23 10:17:02.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:03 compute-1 sudo[228938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:17:03 compute-1 sudo[228938]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:03 compute-1 sudo[228938]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:03 compute-1 ceph-mon[80126]: pgmap v712: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:17:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1117145556' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3419588943' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:03 compute-1 ceph-mon[80126]: pgmap v713: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 6.5 KiB/s rd, 16 KiB/s wr, 1 op/s
Jan 23 10:17:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2232532070' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:03 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:03 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:03.665 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:03.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.868 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.888 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.888 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.907 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.907 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.907 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.907 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:17:03 compute-1 nova_compute[225705]: 2026-01-23 10:17:03.908 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2443746606' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:17:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:17:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:17:04 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1679498423' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.391 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.558 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.560 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5241MB free_disk=59.94271469116211GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.560 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.560 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.652 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.653 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:17:04 compute-1 nova_compute[225705]: 2026-01-23 10:17:04.669 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:17:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:17:05 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/703034000' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:05 compute-1 nova_compute[225705]: 2026-01-23 10:17:05.151 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.482s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:17:05 compute-1 nova_compute[225705]: 2026-01-23 10:17:05.158 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:17:05 compute-1 nova_compute[225705]: 2026-01-23 10:17:05.173 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:17:05 compute-1 nova_compute[225705]: 2026-01-23 10:17:05.176 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:17:05 compute-1 nova_compute[225705]: 2026-01-23 10:17:05.177 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1679498423' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:05 compute-1 ceph-mon[80126]: pgmap v714: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 5.6 KiB/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/703034000' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:05.669 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:05.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:06 compute-1 nova_compute[225705]: 2026-01-23 10:17:06.164 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee80016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:07.673 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:07.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:07 compute-1 ceph-mon[80126]: pgmap v715: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 5.6 KiB/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1232319574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:09 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:17:09.139 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:17:09 compute-1 sudo[229010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:17:09 compute-1 sudo[229010]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:09 compute-1 sudo[229010]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:09.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:09.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:10 compute-1 ceph-mon[80126]: pgmap v716: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:17:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:10 compute-1 podman[229036]: 2026-01-23 10:17:10.682671451 +0000 UTC m=+0.084325505 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 10:17:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:11 compute-1 ceph-mon[80126]: pgmap v717: 353 pgs: 353 active+clean; 121 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:17:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:17:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:11.684 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:17:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:11.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:13 compute-1 ceph-mon[80126]: pgmap v718: 353 pgs: 353 active+clean; 132 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 399 KiB/s wr, 21 op/s
Jan 23 10:17:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:13.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:13.689 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:15.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:15.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:15 compute-1 ceph-mon[80126]: pgmap v719: 353 pgs: 353 active+clean; 132 MiB data, 280 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 396 KiB/s wr, 21 op/s
Jan 23 10:17:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:17 compute-1 ceph-mon[80126]: pgmap v720: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:17.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:17.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:17:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:19 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:19.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:19.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:19 compute-1 ceph-mon[80126]: pgmap v721: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2560479743' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:17:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3266007736' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:17:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:17:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:21 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:21.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:21.701 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:22 compute-1 ceph-mon[80126]: pgmap v722: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:23 compute-1 ceph-mon[80126]: pgmap v723: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 34 op/s
Jan 23 10:17:23 compute-1 sudo[229061]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:17:23 compute-1 sudo[229061]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:23 compute-1 sudo[229061]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:17:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:23.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:23 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:23.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:25.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:25.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:26 compute-1 ceph-mon[80126]: pgmap v724: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 9.2 KiB/s rd, 1.4 MiB/s wr, 13 op/s
Jan 23 10:17:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003cb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:27 compute-1 ceph-mon[80126]: pgmap v725: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 9.2 KiB/s rd, 1.4 MiB/s wr, 13 op/s
Jan 23 10:17:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:27.710 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:27.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003fe0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:29 compute-1 podman[229092]: 2026-01-23 10:17:29.685608472 +0000 UTC m=+0.090637696 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller)
Jan 23 10:17:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:29.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:17:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:29 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:29.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:29 compute-1 ceph-mon[80126]: pgmap v726: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:17:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:31 compute-1 ceph-mon[80126]: pgmap v727: 353 pgs: 353 active+clean; 167 MiB data, 301 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:17:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:17:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:31.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:31 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:31.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:33 compute-1 ceph-mon[80126]: pgmap v728: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 21 KiB/s wr, 75 op/s
Jan 23 10:17:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:33.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2efcf5d0 =====
Jan 23 10:17:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2efcf5d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:33 compute-1 radosgw[83743]: beast: 0x7f0a2efcf5d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:33.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec000d20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:35.722 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:35.724 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:35 compute-1 ceph-mon[80126]: pgmap v729: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 21 KiB/s wr, 75 op/s
Jan 23 10:17:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001840 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:36 compute-1 ceph-mon[80126]: pgmap v730: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 22 KiB/s wr, 75 op/s
Jan 23 10:17:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:37.726 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:37.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001840 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:39 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:17:39 compute-1 ceph-mon[80126]: pgmap v731: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 22 KiB/s wr, 75 op/s
Jan 23 10:17:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:39.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:39.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:41 compute-1 podman[229126]: 2026-01-23 10:17:41.66241504 +0000 UTC m=+0.061464583 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:17:41 compute-1 ceph-mon[80126]: pgmap v732: 353 pgs: 353 active+clean; 167 MiB data, 302 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 22 KiB/s wr, 75 op/s
Jan 23 10:17:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:41.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:41.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001840 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:43 compute-1 sudo[229145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:17:43 compute-1 sudo[229145]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:17:43 compute-1 sudo[229145]: pam_unix(sudo:session): session closed for user root
Jan 23 10:17:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:43 compute-1 ceph-mon[80126]: pgmap v733: 353 pgs: 353 active+clean; 197 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.1 MiB/s wr, 132 op/s
Jan 23 10:17:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:43.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:43.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec002cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:45 compute-1 ceph-mon[80126]: pgmap v734: 353 pgs: 353 active+clean; 197 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 207 KiB/s rd, 2.1 MiB/s wr, 57 op/s
Jan 23 10:17:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:45.739 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:45.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec002cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:47.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:47.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:47 compute-1 ceph-mon[80126]: pgmap v735: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 257 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:17:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3688632625' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:17:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3688632625' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:17:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:49.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:49.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:49 compute-1 ceph-mon[80126]: pgmap v736: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 257 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:17:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec002cd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:17:50 compute-1 ceph-mon[80126]: pgmap v737: 353 pgs: 353 active+clean; 200 MiB data, 344 MiB used, 60 GiB / 60 GiB avail; 257 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:17:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003c30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:51.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:51.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:51 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1982629231' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:52 compute-1 sshd-session[229175]: Invalid user sol from 45.148.10.240 port 60478
Jan 23 10:17:52 compute-1 sshd-session[229175]: Connection closed by invalid user sol 45.148.10.240 port 60478 [preauth]
Jan 23 10:17:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003dd0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:53 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1712693474' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:17:53 compute-1 ceph-mon[80126]: pgmap v738: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 276 KiB/s rd, 2.1 MiB/s wr, 90 op/s
Jan 23 10:17:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_14] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:53.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:53.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:17:55.045 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:17:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:17:55.045 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:17:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:17:55.046 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:17:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:55.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:55.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:55 compute-1 ceph-mon[80126]: pgmap v739: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 18 KiB/s wr, 33 op/s
Jan 23 10:17:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003d40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:57 compute-1 ceph-mon[80126]: pgmap v740: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 19 KiB/s wr, 33 op/s
Jan 23 10:17:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:57.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:57.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003880 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:17:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef40016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:17:59 compute-1 ceph-mon[80126]: pgmap v741: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:17:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:17:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:17:59.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:17:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:17:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:17:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:17:59.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:17:59 compute-1 nova_compute[225705]: 2026-01-23 10:17:59.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:59 compute-1 nova_compute[225705]: 2026-01-23 10:17:59.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:17:59 compute-1 nova_compute[225705]: 2026-01-23 10:17:59.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:17:59 compute-1 nova_compute[225705]: 2026-01-23 10:17:59.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:17:59 compute-1 nova_compute[225705]: 2026-01-23 10:17:59.892 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:17:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:17:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003d60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:00 compute-1 podman[229183]: 2026-01-23 10:18:00.680262911 +0000 UTC m=+0.089425866 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 10:18:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2675935538' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:00 compute-1 nova_compute[225705]: 2026-01-23 10:18:00.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:01 compute-1 ceph-mon[80126]: pgmap v742: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:18:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2538524120' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:01.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:01.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003d80 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:02 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1665070625' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:02 compute-1 nova_compute[225705]: 2026-01-23 10:18:02.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:03 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:03.419 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:18:03 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:03.421 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:18:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:03 compute-1 sudo[229211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:18:03 compute-1 sudo[229211]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:03 compute-1 sudo[229211]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:03.767 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:03.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:03 compute-1 nova_compute[225705]: 2026-01-23 10:18:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:03 compute-1 nova_compute[225705]: 2026-01-23 10:18:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:03 compute-1 nova_compute[225705]: 2026-01-23 10:18:03.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:18:03 compute-1 ceph-mon[80126]: pgmap v743: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 38 KiB/s rd, 15 KiB/s wr, 56 op/s
Jan 23 10:18:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2045104684' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:04 compute-1 nova_compute[225705]: 2026-01-23 10:18:04.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:04 compute-1 nova_compute[225705]: 2026-01-23 10:18:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:04 compute-1 nova_compute[225705]: 2026-01-23 10:18:04.894 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:04 compute-1 nova_compute[225705]: 2026-01-23 10:18:04.895 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:04 compute-1 nova_compute[225705]: 2026-01-23 10:18:04.895 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:04 compute-1 nova_compute[225705]: 2026-01-23 10:18:04.895 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:18:04 compute-1 nova_compute[225705]: 2026-01-23 10:18:04.895 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3410514855' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:04 compute-1 ceph-mon[80126]: pgmap v744: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 23 10:18:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003da0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.375 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:05.422 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.537 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.538 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=5208MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.538 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.538 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.598 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.599 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:18:05 compute-1 nova_compute[225705]: 2026-01-23 10:18:05.617 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:05.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:05.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4166887387' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:06 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:18:06 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4138145315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:06 compute-1 nova_compute[225705]: 2026-01-23 10:18:06.095 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.478s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:06 compute-1 nova_compute[225705]: 2026-01-23 10:18:06.100 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:18:06 compute-1 nova_compute[225705]: 2026-01-23 10:18:06.117 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:18:06 compute-1 nova_compute[225705]: 2026-01-23 10:18:06.118 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:18:06 compute-1 nova_compute[225705]: 2026-01-23 10:18:06.119 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:06 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4138145315' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:06 compute-1 ceph-mon[80126]: pgmap v745: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 23 10:18:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:07 compute-1 nova_compute[225705]: 2026-01-23 10:18:07.121 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101807 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:18:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:07.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:07.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003dc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:09 compute-1 sudo[229283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:18:09 compute-1 sudo[229283]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:09 compute-1 sudo[229283]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:09 compute-1 sudo[229308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:18:09 compute-1 sudo[229308]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:09 compute-1 ceph-mon[80126]: pgmap v746: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:18:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:09.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:09.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:10 compute-1 sudo[229308]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:18:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:18:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:18:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:18:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:18:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.640401) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491640479, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1203, "num_deletes": 251, "total_data_size": 2841138, "memory_usage": 2879920, "flush_reason": "Manual Compaction"}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491654356, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1850362, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 24631, "largest_seqno": 25829, "table_properties": {"data_size": 1845137, "index_size": 2685, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 11658, "raw_average_key_size": 19, "raw_value_size": 1834496, "raw_average_value_size": 3135, "num_data_blocks": 120, "num_entries": 585, "num_filter_entries": 585, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163397, "oldest_key_time": 1769163397, "file_creation_time": 1769163491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 14012 microseconds, and 4916 cpu microseconds.
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.654425) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1850362 bytes OK
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.654451) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.656854) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.656918) EVENT_LOG_v1 {"time_micros": 1769163491656907, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.656946) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2835298, prev total WAL file size 2835298, number of live WAL files 2.
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.657999) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730031373537' seq:72057594037927935, type:22 .. '7061786F730032303039' seq:0, type:0; will stop at (end)
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1806KB)], [48(12MB)]
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491658079, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 14993465, "oldest_snapshot_seqno": -1}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 5452 keys, 12824717 bytes, temperature: kUnknown
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491740662, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 12824717, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12788255, "index_size": 21760, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 13637, "raw_key_size": 139979, "raw_average_key_size": 25, "raw_value_size": 12689105, "raw_average_value_size": 2327, "num_data_blocks": 884, "num_entries": 5452, "num_filter_entries": 5452, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163491, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.741003) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 12824717 bytes
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.742637) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.3 rd, 155.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.8, 12.5 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(15.0) write-amplify(6.9) OK, records in: 5968, records dropped: 516 output_compression: NoCompression
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.742662) EVENT_LOG_v1 {"time_micros": 1769163491742649, "job": 28, "event": "compaction_finished", "compaction_time_micros": 82722, "compaction_time_cpu_micros": 35059, "output_level": 6, "num_output_files": 1, "total_output_size": 12824717, "num_input_records": 5968, "num_output_records": 5452, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491743388, "job": 28, "event": "table_file_deletion", "file_number": 50}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163491746760, "job": 28, "event": "table_file_deletion", "file_number": 48}
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.657822) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.746900) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.746908) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.746909) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.746911) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:18:11.746912) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:18:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:11.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:11.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003de0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:12 compute-1 ceph-mon[80126]: pgmap v747: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:18:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003de0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:12 compute-1 podman[229366]: 2026-01-23 10:18:12.644312337 +0000 UTC m=+0.049065751 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:18:13 compute-1 ceph-mon[80126]: pgmap v748: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:18:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_16] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:18:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:13.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:18:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:13.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:14 compute-1 ceph-mon[80126]: pgmap v749: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:18:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:15.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:15.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003de0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:16 compute-1 sudo[229388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:18:16 compute-1 sudo[229388]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:16 compute-1 sudo[229388]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:18:16 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:16 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:18:16 compute-1 ceph-mon[80126]: pgmap v750: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:18:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:17.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:17.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:18:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:18:19 compute-1 ceph-mon[80126]: pgmap v751: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:18:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:18:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:19.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:19.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e20 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:21.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:21.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:22 compute-1 ceph-mon[80126]: pgmap v752: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.157 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.158 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.173 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.247 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.248 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.255 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.256 225709 INFO nova.compute.claims [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Claim successful on node compute-1.ctlplane.example.com
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.354 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:18:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:18:22 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/453783286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.842 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.849 225709 DEBUG nova.compute.provider_tree [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.864 225709 DEBUG nova.scheduler.client.report [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.924 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.925 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.986 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:18:22 compute-1 nova_compute[225705]: 2026-01-23 10:18:22.986 225709 DEBUG nova.network.neutron [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.017 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.050 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:18:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.145 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.147 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.147 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Creating image(s)
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.179 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.209 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.238 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.242 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.243 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:23 compute-1 ceph-mon[80126]: pgmap v753: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:18:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/453783286' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:18:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.496 225709 WARNING oslo_policy.policy [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.497 225709 WARNING oslo_policy.policy [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.499 225709 DEBUG nova.policy [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:18:23 compute-1 nova_compute[225705]: 2026-01-23 10:18:23.513 225709 DEBUG nova.virt.libvirt.imagebackend [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image locations are: [{'url': 'rbd://f3005f84-239a-55b6-a948-8f1fb592b920/images/271ec98e-d058-421b-bbfb-4b4a5954c90a/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://f3005f84-239a-55b6-a948-8f1fb592b920/images/271ec98e-d058-421b-bbfb-4b4a5954c90a/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Jan 23 10:18:23 compute-1 sudo[229493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:18:23 compute-1 sudo[229493]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:23 compute-1 sudo[229493]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:23.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:23.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:24 compute-1 nova_compute[225705]: 2026-01-23 10:18:24.381 225709 DEBUG nova.network.neutron [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Successfully created port: e056b1c4-d8ee-40be-ab65-dad6851e9340 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:18:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef80041f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:24 compute-1 nova_compute[225705]: 2026-01-23 10:18:24.591 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:24 compute-1 nova_compute[225705]: 2026-01-23 10:18:24.661 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:24 compute-1 nova_compute[225705]: 2026-01-23 10:18:24.662 225709 DEBUG nova.virt.images [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] 271ec98e-d058-421b-bbfb-4b4a5954c90a was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Jan 23 10:18:24 compute-1 nova_compute[225705]: 2026-01-23 10:18:24.664 225709 DEBUG nova.privsep.utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 10:18:24 compute-1 nova_compute[225705]: 2026-01-23 10:18:24.664 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_5] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.127 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.part /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.132 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.190 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c.converted --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.191 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.949s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.223 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.228 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c ed3c80d1-b549-49d1-be66-00467e195256_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.505 225709 DEBUG nova.network.neutron [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Successfully updated port: e056b1c4-d8ee-40be-ab65-dad6851e9340 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.523 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.524 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.524 225709 DEBUG nova.network.neutron [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.623 225709 DEBUG nova.compute.manager [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.624 225709 DEBUG nova.compute.manager [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing instance network info cache due to event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.625 225709 DEBUG oslo_concurrency.lockutils [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:18:25 compute-1 nova_compute[225705]: 2026-01-23 10:18:25.677 225709 DEBUG nova.network.neutron [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:18:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:25.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:25.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:25 compute-1 ceph-mon[80126]: pgmap v754: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 938 B/s wr, 3 op/s
Jan 23 10:18:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.229 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c ed3c80d1-b549-49d1-be66-00467e195256_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.000s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.298 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.366 225709 DEBUG nova.network.neutron [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.410 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.411 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance network_info: |[{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.413 225709 DEBUG oslo_concurrency.lockutils [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.413 225709 DEBUG nova.network.neutron [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.421 225709 DEBUG nova.objects.instance [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:18:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.483 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.484 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Ensure instance console log exists: /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.485 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.485 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.486 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.489 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Start _get_guest_xml network_info=[{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.496 225709 WARNING nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.503 225709 DEBUG nova.virt.libvirt.host [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.504 225709 DEBUG nova.virt.libvirt.host [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.509 225709 DEBUG nova.virt.libvirt.host [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.510 225709 DEBUG nova.virt.libvirt.host [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.511 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.512 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.512 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.513 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.513 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.513 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.513 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.513 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.514 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.514 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.514 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.514 225709 DEBUG nova.virt.hardware [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.518 225709 DEBUG nova.privsep.utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Jan 23 10:18:26 compute-1 nova_compute[225705]: 2026-01-23 10:18:26.519 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:18:27 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3880301074' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:18:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.314 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.795s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.468 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.473 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:27 compute-1 ceph-mon[80126]: pgmap v755: 353 pgs: 353 active+clean; 88 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 23 10:18:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:27.797 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:27.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:18:27 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1407883863' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.961 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.487s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.964 225709 DEBUG nova.virt.libvirt.vif [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:18:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.965 225709 DEBUG nova.network.os_vif_util [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.967 225709 DEBUG nova.network.os_vif_util [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.970 225709 DEBUG nova.objects.instance [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:18:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.986 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <name>instance-00000003</name>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <memory>131072</memory>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <vcpu>1</vcpu>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <nova:creationTime>2026-01-23 10:18:26</nova:creationTime>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <nova:flavor name="m1.nano">
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:memory>128</nova:memory>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:disk>1</nova:disk>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:swap>0</nova:swap>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       </nova:flavor>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <nova:owner>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       </nova:owner>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <nova:ports>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:18:27 compute-1 nova_compute[225705]:           <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         </nova:port>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       </nova:ports>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </nova:instance>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <sysinfo type="smbios">
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <system>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <entry name="serial">ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <entry name="uuid">ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </system>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <os>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <boot dev="hd"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <smbios mode="sysinfo"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   </os>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <features>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <vmcoreinfo/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   </features>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <clock offset="utc">
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <timer name="hpet" present="no"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <cpu mode="host-model" match="exact">
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <disk type="network" device="disk">
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/ed3c80d1-b549-49d1-be66-00467e195256_disk">
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       </source>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <target dev="vda" bus="virtio"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <disk type="network" device="cdrom">
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config">
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       </source>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:18:27 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <target dev="sda" bus="sata"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <interface type="ethernet">
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <mac address="fa:16:3e:42:a1:b7"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <mtu size="1442"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <target dev="tape056b1c4-d8"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <serial type="pty">
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <log file="/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log" append="off"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <video>
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </video>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <input type="tablet" bus="usb"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <rng model="virtio">
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <controller type="usb" index="0"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     <memballoon model="virtio">
Jan 23 10:18:27 compute-1 nova_compute[225705]:       <stats period="10"/>
Jan 23 10:18:27 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:18:27 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:18:27 compute-1 nova_compute[225705]: </domain>
Jan 23 10:18:27 compute-1 nova_compute[225705]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.988 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Preparing to wait for external event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.989 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.989 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.989 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.990 225709 DEBUG nova.virt.libvirt.vif [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:18:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.991 225709 DEBUG nova.network.os_vif_util [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.991 225709 DEBUG nova.network.os_vif_util [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:18:27 compute-1 nova_compute[225705]: 2026-01-23 10:18:27.992 225709 DEBUG os_vif [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.027 225709 DEBUG ovsdbapp.backend.ovs_idl [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.027 225709 DEBUG ovsdbapp.backend.ovs_idl [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.028 225709 DEBUG ovsdbapp.backend.ovs_idl [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.028 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.029 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLOUT] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.029 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.030 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.031 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.033 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.048 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.049 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.049 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.051 225709 INFO oslo.privsep.daemon [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpyxiu3qvc/privsep.sock']
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.414 225709 DEBUG nova.network.neutron [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated VIF entry in instance network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.415 225709 DEBUG nova.network.neutron [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:18:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.455 225709 DEBUG oslo_concurrency.lockutils [req-3ef060de-7efb-4ac7-b244-88b192cf54b9 req-62180d03-5832-420d-a993-9278195c05d5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:18:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3880301074' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:18:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1407883863' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.750 225709 INFO oslo.privsep.daemon [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Spawned new privsep daemon via rootwrap
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.618 229713 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.622 229713 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.625 229713 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Jan 23 10:18:28 compute-1 nova_compute[225705]: 2026-01-23 10:18:28.625 229713 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229713
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.062 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:29 compute-1 rsyslogd[1006]: imjournal: 3618 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.064 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape056b1c4-d8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.064 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape056b1c4-d8, col_values=(('external_ids', {'iface-id': 'e056b1c4-d8ee-40be-ab65-dad6851e9340', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:42:a1:b7', 'vm-uuid': 'ed3c80d1-b549-49d1-be66-00467e195256'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.066 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:29 compute-1 NetworkManager[48978]: <info>  [1769163509.0678] manager: (tape056b1c4-d8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.068 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.078 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.079 225709 INFO os_vif [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8')
Jan 23 10:18:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.127 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.127 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.127 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:42:a1:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.128 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Using config drive
Jan 23 10:18:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101829 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.155 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.396 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Creating config drive at /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.404 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp27pbnrm5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.534 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp27pbnrm5" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.561 225709 DEBUG nova.storage.rbd_utils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ed3c80d1-b549-49d1-be66-00467e195256_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:18:29 compute-1 nova_compute[225705]: 2026-01-23 10:18:29.565 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config ed3c80d1-b549-49d1-be66-00467e195256_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:18:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:29.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:29.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:30 compute-1 ceph-mon[80126]: pgmap v756: 353 pgs: 353 active+clean; 88 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:18:30 compute-1 nova_compute[225705]: 2026-01-23 10:18:30.738 225709 DEBUG oslo_concurrency.processutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config ed3c80d1-b549-49d1-be66-00467e195256_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.173s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:18:30 compute-1 nova_compute[225705]: 2026-01-23 10:18:30.739 225709 INFO nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deleting local config drive /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/disk.config because it was imported into RBD.
Jan 23 10:18:30 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 23 10:18:30 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 23 10:18:30 compute-1 kernel: tun: Universal TUN/TAP device driver, 1.6
Jan 23 10:18:30 compute-1 kernel: tape056b1c4-d8: entered promiscuous mode
Jan 23 10:18:30 compute-1 NetworkManager[48978]: <info>  [1769163510.8844] manager: (tape056b1c4-d8): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Jan 23 10:18:30 compute-1 ovn_controller[133293]: 2026-01-23T10:18:30Z|00027|binding|INFO|Claiming lport e056b1c4-d8ee-40be-ab65-dad6851e9340 for this chassis.
Jan 23 10:18:30 compute-1 ovn_controller[133293]: 2026-01-23T10:18:30Z|00028|binding|INFO|e056b1c4-d8ee-40be-ab65-dad6851e9340: Claiming fa:16:3e:42:a1:b7 10.100.0.13
Jan 23 10:18:30 compute-1 nova_compute[225705]: 2026-01-23 10:18:30.887 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.907 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:a1:b7 10.100.0.13'], port_security=['fa:16:3e:42:a1:b7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '96259b98-6654-41f6-bfeb-290c4063344e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93789b9e-064c-44b7-b00b-f52ca7e4569d, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=e056b1c4-d8ee-40be-ab65-dad6851e9340) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:18:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.909 143098 INFO neutron.agent.ovn.metadata.agent [-] Port e056b1c4-d8ee-40be-ab65-dad6851e9340 in datapath 4f467dc5-4a9f-42dc-990e-a2a671c8b09c bound to our chassis
Jan 23 10:18:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.911 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 4f467dc5-4a9f-42dc-990e-a2a671c8b09c
Jan 23 10:18:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:30.912 143098 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpwhp69nko/privsep.sock']
Jan 23 10:18:30 compute-1 systemd-udevd[229839]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:18:30 compute-1 systemd-machined[194551]: New machine qemu-1-instance-00000003.
Jan 23 10:18:30 compute-1 NetworkManager[48978]: <info>  [1769163510.9580] device (tape056b1c4-d8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:18:30 compute-1 NetworkManager[48978]: <info>  [1769163510.9593] device (tape056b1c4-d8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:18:30 compute-1 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Jan 23 10:18:30 compute-1 nova_compute[225705]: 2026-01-23 10:18:30.981 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:30 compute-1 ovn_controller[133293]: 2026-01-23T10:18:30Z|00029|binding|INFO|Setting lport e056b1c4-d8ee-40be-ab65-dad6851e9340 ovn-installed in OVS
Jan 23 10:18:30 compute-1 ovn_controller[133293]: 2026-01-23T10:18:30Z|00030|binding|INFO|Setting lport e056b1c4-d8ee-40be-ab65-dad6851e9340 up in Southbound
Jan 23 10:18:30 compute-1 nova_compute[225705]: 2026-01-23 10:18:30.998 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:31 compute-1 podman[229778]: 2026-01-23 10:18:31.01057181 +0000 UTC m=+0.220245120 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:18:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.172 225709 DEBUG nova.compute.manager [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.172 225709 DEBUG oslo_concurrency.lockutils [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.172 225709 DEBUG oslo_concurrency.lockutils [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.173 225709 DEBUG oslo_concurrency.lockutils [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.173 225709 DEBUG nova.compute.manager [req-2a9ed03b-0747-48a8-a4ba-957f1bd94c98 req-f80d5fc7-ce4e-47ed-bb30-c018185fb2b3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Processing event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.248 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.487 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163511.4864285, ed3c80d1-b549-49d1-be66-00467e195256 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.488 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Started (Lifecycle Event)
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.491 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.496 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.501 225709 INFO nova.virt.libvirt.driver [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance spawned successfully.
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.502 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.512 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.520 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.525 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.526 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.526 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.527 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.527 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.527 225709 DEBUG nova.virt.libvirt.driver [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.558 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.559 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163511.4866998, ed3c80d1-b549-49d1-be66-00467e195256 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.559 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Paused (Lifecycle Event)
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.599 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.603 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163511.4953184, ed3c80d1-b549-49d1-be66-00467e195256 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.603 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Resumed (Lifecycle Event)
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.607 225709 INFO nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 8.46 seconds to spawn the instance on the hypervisor.
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.608 225709 DEBUG nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.618 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.623 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.650 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:18:31 compute-1 ceph-mon[80126]: pgmap v757: 353 pgs: 353 active+clean; 88 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.670 225709 INFO nova.compute.manager [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 9.45 seconds to build instance.
Jan 23 10:18:31 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.678 143098 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 10:18:31 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.680 143098 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwhp69nko/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 10:18:31 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.540 229898 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:18:31 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.545 229898 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:18:31 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.546 229898 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Jan 23 10:18:31 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.547 229898 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229898
Jan 23 10:18:31 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:31.683 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[9726addb-4939-421e-90f0-82628f34a560]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:31 compute-1 nova_compute[225705]: 2026-01-23 10:18:31.686 225709 DEBUG oslo_concurrency.lockutils [None req-a0309618-f60f-4774-afe6-42fedbf75eba f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.528s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:31.800 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:31.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.251 229898 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.251 229898 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.251 229898 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.855 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c6d75731-3b88-4dd3-83a7-14415bdb0f31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.856 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap4f467dc5-41 in ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.858 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap4f467dc5-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.858 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[97e83ca2-e11c-4412-8c8c-c416fd3c6d51]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.860 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[47ce72c7-1cf9-452a-99b7-032493d11143]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.884 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[08aaa96c-97cf-4938-bb74-28c75ee08873]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.903 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[610e3f31-122a-4eab-8965-7b3d134f4190]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:32.907 143098 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp2keu1f1z/privsep.sock']
Jan 23 10:18:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:33 compute-1 nova_compute[225705]: 2026-01-23 10:18:33.278 225709 DEBUG nova.compute.manager [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:18:33 compute-1 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG oslo_concurrency.lockutils [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:33 compute-1 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG oslo_concurrency.lockutils [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:33 compute-1 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG oslo_concurrency.lockutils [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:33 compute-1 nova_compute[225705]: 2026-01-23 10:18:33.279 225709 DEBUG nova.compute.manager [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:18:33 compute-1 nova_compute[225705]: 2026-01-23 10:18:33.280 225709 WARNING nova.compute.manager [req-6d366101-4a18-479f-90e6-ecaa387cfc67 req-20588db4-8970-4901-8a34-52717a5d9aa3 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 for instance with vm_state active and task_state None.
Jan 23 10:18:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:33 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.707 143098 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Jan 23 10:18:33 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.708 143098 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp2keu1f1z/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Jan 23 10:18:33 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.606 229913 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:18:33 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.609 229913 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:18:33 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.611 229913 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 23 10:18:33 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.611 229913 INFO oslo.privsep.daemon [-] privsep daemon running as pid 229913
Jan 23 10:18:33 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:33.711 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[22de4e2c-2057-4577-810e-6506987d6680]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:33 compute-1 ceph-mon[80126]: pgmap v758: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 42 op/s
Jan 23 10:18:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:33.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:33.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:34 compute-1 nova_compute[225705]: 2026-01-23 10:18:34.067 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.199 229913 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.199 229913 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.199 229913 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.791 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[00b4d305-dbd0-4555-bb80-fa7032b7d712]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 NetworkManager[48978]: <info>  [1769163514.8100] manager: (tap4f467dc5-40): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.809 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7bb72c89-c3f6-4a83-aad1-8b88de714140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 systemd-udevd[229925]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.839 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[e793fa8b-3acc-416c-84f4-ec770a1b48d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.845 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[b4a0221d-4693-4d6f-8845-1482e0f2d80f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 NetworkManager[48978]: <info>  [1769163514.8696] device (tap4f467dc5-40): carrier: link connected
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.874 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[384fe701-baf2-491a-b10e-7e1c6c20b770]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.895 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[54fe8993-7ca8-45e2-89bc-7b30371bc522]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f467dc5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9b:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466885, 'reachable_time': 23089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229943, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.912 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[374b6fe3-c774-4db1-9c39-056435abf99b]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fec3:9bc5'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 466885, 'tstamp': 466885}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229944, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.929 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[14c12a87-50e2-4721-8078-ec197b674db7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap4f467dc5-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:c3:9b:c5'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466885, 'reachable_time': 23089, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229945, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:34 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:34.959 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[9a8dda72-a2b7-4d2d-921f-68950dd28b8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.013 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[70e784e0-0f6a-466b-830b-e3c6ad81aaeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.015 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f467dc5-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.015 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.016 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4f467dc5-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:35 compute-1 nova_compute[225705]: 2026-01-23 10:18:35.057 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:35 compute-1 kernel: tap4f467dc5-40: entered promiscuous mode
Jan 23 10:18:35 compute-1 NetworkManager[48978]: <info>  [1769163515.0638] manager: (tap4f467dc5-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Jan 23 10:18:35 compute-1 nova_compute[225705]: 2026-01-23 10:18:35.064 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.065 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap4f467dc5-40, col_values=(('external_ids', {'iface-id': '572285ac-9ff4-42d8-9b72-b5588035f74c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:18:35 compute-1 nova_compute[225705]: 2026-01-23 10:18:35.066 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:35 compute-1 ovn_controller[133293]: 2026-01-23T10:18:35Z|00031|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.070 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.071 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6d623c4d-f850-42ef-b902-8ccaeac0642e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.073 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: global
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     log         /dev/log local0 debug
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     log-tag     haproxy-metadata-proxy-4f467dc5-4a9f-42dc-990e-a2a671c8b09c
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     user        root
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     group       root
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     maxconn     1024
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     pidfile     /var/lib/neutron/external/pids/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.pid.haproxy
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     daemon
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: defaults
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     log global
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     mode http
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     option httplog
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     option dontlognull
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     option http-server-close
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     option forwardfor
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     retries                 3
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     timeout http-request    30s
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     timeout connect         30s
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     timeout client          32s
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     timeout server          32s
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     timeout http-keep-alive 30s
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: listen listener
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     bind 169.254.169.254:80
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:     http-request add-header X-OVN-Network-ID 4f467dc5-4a9f-42dc-990e-a2a671c8b09c
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:18:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:35.074 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'env', 'PROCESS_TAG=haproxy-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/4f467dc5-4a9f-42dc-990e-a2a671c8b09c.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:18:35 compute-1 nova_compute[225705]: 2026-01-23 10:18:35.082 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:35 compute-1 podman[229978]: 2026-01-23 10:18:35.46865314 +0000 UTC m=+0.058749877 container create d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:18:35 compute-1 systemd[1]: Started libpod-conmon-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f.scope.
Jan 23 10:18:35 compute-1 podman[229978]: 2026-01-23 10:18:35.439739736 +0000 UTC m=+0.029836503 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:18:35 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:18:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/38a430a453066cd300215ffab9c681910b2ee216372ea5d2773756ffea2ac606/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:18:35 compute-1 podman[229978]: 2026-01-23 10:18:35.559752599 +0000 UTC m=+0.149849336 container init d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:18:35 compute-1 podman[229978]: 2026-01-23 10:18:35.566614376 +0000 UTC m=+0.156711113 container start d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Jan 23 10:18:35 compute-1 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : New worker (230000) forked
Jan 23 10:18:35 compute-1 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : Loading success.
Jan 23 10:18:35 compute-1 ceph-mon[80126]: pgmap v759: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.8 MiB/s wr, 39 op/s
Jan 23 10:18:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:35.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:35.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:36 compute-1 nova_compute[225705]: 2026-01-23 10:18:36.282 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:37.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:37.828 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:38 compute-1 ceph-mon[80126]: pgmap v760: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 3.6 MiB/s rd, 1.8 MiB/s wr, 108 op/s
Jan 23 10:18:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5508] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/27)
Jan 23 10:18:38 compute-1 ovn_controller[133293]: 2026-01-23T10:18:38Z|00032|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.551 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5531] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <warn>  [1769163518.5535] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5552] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/28)
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5558] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <warn>  [1769163518.5559] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5572] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/29)
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5582] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5590] device (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 10:18:38 compute-1 NetworkManager[48978]: <info>  [1769163518.5596] device (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Jan 23 10:18:38 compute-1 ovn_controller[133293]: 2026-01-23T10:18:38Z|00033|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.585 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.590 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG nova.compute.manager [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG nova.compute.manager [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing instance network info cache due to event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG oslo_concurrency.lockutils [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.760 225709 DEBUG oslo_concurrency.lockutils [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:18:38 compute-1 nova_compute[225705]: 2026-01-23 10:18:38.761 225709 DEBUG nova.network.neutron [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:18:39 compute-1 nova_compute[225705]: 2026-01-23 10:18:39.069 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:39 compute-1 ceph-mon[80126]: pgmap v761: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 83 op/s
Jan 23 10:18:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:39.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:39.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec0026f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:40 compute-1 nova_compute[225705]: 2026-01-23 10:18:40.587 225709 DEBUG nova.network.neutron [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated VIF entry in instance network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:18:40 compute-1 nova_compute[225705]: 2026-01-23 10:18:40.588 225709 DEBUG nova.network.neutron [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:18:40 compute-1 nova_compute[225705]: 2026-01-23 10:18:40.605 225709 DEBUG oslo_concurrency.lockutils [req-4cbe73e7-203d-45fb-9d29-52a335a61b90 req-80114c55-9fb7-469c-8935-976fb18500a5 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:18:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:41 compute-1 nova_compute[225705]: 2026-01-23 10:18:41.284 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:41.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:18:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:41.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:18:41 compute-1 ceph-mon[80126]: pgmap v762: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 83 op/s
Jan 23 10:18:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:43 compute-1 ceph-mon[80126]: pgmap v763: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 13 KiB/s wr, 83 op/s
Jan 23 10:18:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:43 compute-1 podman[230016]: 2026-01-23 10:18:43.688418712 +0000 UTC m=+0.082679693 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 10:18:43 compute-1 sudo[230033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:18:43 compute-1 sudo[230033]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:18:43 compute-1 sudo[230033]: pam_unix(sudo:session): session closed for user root
Jan 23 10:18:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:43.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:43.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:44 compute-1 nova_compute[225705]: 2026-01-23 10:18:44.072 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:45 compute-1 ovn_controller[133293]: 2026-01-23T10:18:45Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:42:a1:b7 10.100.0.13
Jan 23 10:18:45 compute-1 ovn_controller[133293]: 2026-01-23T10:18:45Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:42:a1:b7 10.100.0.13
Jan 23 10:18:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_12] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:45 compute-1 ceph-mon[80126]: pgmap v764: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 68 op/s
Jan 23 10:18:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:45.816 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:45.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:46 compute-1 nova_compute[225705]: 2026-01-23 10:18:46.286 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101847 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 116ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:18:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:18:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:47.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:18:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:47.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:47 compute-1 ceph-mon[80126]: pgmap v765: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 134 op/s
Jan 23 10:18:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:48 compute-1 ceph-mon[80126]: pgmap v766: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2862770348' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:18:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2862770348' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:18:49 compute-1 nova_compute[225705]: 2026-01-23 10:18:49.082 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:49.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:49.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_18] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14004cf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:18:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_3] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:51 compute-1 ceph-mon[80126]: pgmap v767: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:51 compute-1 nova_compute[225705]: 2026-01-23 10:18:51.310 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:51 compute-1 nova_compute[225705]: 2026-01-23 10:18:51.545 225709 INFO nova.compute.manager [None req-3b9a1359-602c-4d0e-92e3-2c69b939e4b0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Get console output
Jan 23 10:18:51 compute-1 nova_compute[225705]: 2026-01-23 10:18:51.550 225709 INFO oslo.privsep.daemon [None req-3b9a1359-602c-4d0e-92e3-2c69b939e4b0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmp52lkgdah/privsep.sock']
Jan 23 10:18:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:51.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:51.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:52 compute-1 nova_compute[225705]: 2026-01-23 10:18:52.252 225709 INFO oslo.privsep.daemon [None req-3b9a1359-602c-4d0e-92e3-2c69b939e4b0 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Spawned new privsep daemon via rootwrap
Jan 23 10:18:52 compute-1 nova_compute[225705]: 2026-01-23 10:18:52.099 230072 INFO oslo.privsep.daemon [-] privsep daemon starting
Jan 23 10:18:52 compute-1 nova_compute[225705]: 2026-01-23 10:18:52.104 230072 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Jan 23 10:18:52 compute-1 nova_compute[225705]: 2026-01-23 10:18:52.106 230072 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Jan 23 10:18:52 compute-1 nova_compute[225705]: 2026-01-23 10:18:52.106 230072 INFO oslo.privsep.daemon [-] privsep daemon running as pid 230072
Jan 23 10:18:52 compute-1 nova_compute[225705]: 2026-01-23 10:18:52.348 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 10:18:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040008d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0000b60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:53 compute-1 ceph-mon[80126]: pgmap v768: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:53.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:53.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:18:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:54 compute-1 nova_compute[225705]: 2026-01-23 10:18:54.084 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:55.047 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:55.047 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:18:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:18:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040008d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:55.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:55 compute-1 ceph-mon[80126]: pgmap v769: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 388 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:18:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:55.881 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:56 compute-1 nova_compute[225705]: 2026-01-23 10:18:56.311 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:18:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:57.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:57 compute-1 ceph-mon[80126]: pgmap v770: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 389 KiB/s rd, 2.1 MiB/s wr, 67 op/s
Jan 23 10:18:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:57.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002120 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:18:58 compute-1 ceph-mon[80126]: pgmap v771: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.086 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:18:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.217 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.218 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.219 225709 DEBUG nova.objects.instance [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.567 225709 DEBUG nova.objects.instance [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_requests' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.587 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.794 225709 DEBUG nova.policy [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:18:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:18:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:18:59.831 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:18:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:18:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:18:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.868 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:18:59 compute-1 nova_compute[225705]: 2026-01-23 10:18:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:18:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:18:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:18:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:18:59.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1911174770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:00 compute-1 nova_compute[225705]: 2026-01-23 10:19:00.362 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:00 compute-1 nova_compute[225705]: 2026-01-23 10:19:00.363 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:00 compute-1 nova_compute[225705]: 2026-01-23 10:19:00.363 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 10:19:00 compute-1 nova_compute[225705]: 2026-01-23 10:19:00.363 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:19:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002120 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00016a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:01 compute-1 ceph-mon[80126]: pgmap v772: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 12 KiB/s wr, 1 op/s
Jan 23 10:19:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1756463740' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:01 compute-1 nova_compute[225705]: 2026-01-23 10:19:01.313 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:01 compute-1 nova_compute[225705]: 2026-01-23 10:19:01.357 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Successfully created port: 35c98901-92ff-40ab-a9c4-0da34169949c _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:19:01 compute-1 podman[230079]: 2026-01-23 10:19:01.737567892 +0000 UTC m=+0.134832442 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:19:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:01.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:01.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.105 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.121 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.121 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.122 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.305 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Successfully updated port: 35c98901-92ff-40ab-a9c4-0da34169949c _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.322 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.322 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.322 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.423 225709 DEBUG nova.compute.manager [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-changed-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.424 225709 DEBUG nova.compute.manager [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing instance network info cache due to event network-changed-35c98901-92ff-40ab-a9c4-0da34169949c. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.424 225709 DEBUG oslo_concurrency.lockutils [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00041e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:02 compute-1 nova_compute[225705]: 2026-01-23 10:19:02.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:19:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:03 compute-1 ceph-mon[80126]: pgmap v773: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 15 KiB/s wr, 4 op/s
Jan 23 10:19:03 compute-1 sudo[230107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:19:03 compute-1 sudo[230107]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:03 compute-1 sudo[230107]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:03.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:03 compute-1 nova_compute[225705]: 2026-01-23 10:19:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:03 compute-1 nova_compute[225705]: 2026-01-23 10:19:03.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:03 compute-1 nova_compute[225705]: 2026-01-23 10:19:03.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:19:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:03.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.088 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2031067823' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.857 225709 DEBUG nova.network.neutron [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.894 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.896 225709 DEBUG oslo_concurrency.lockutils [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.896 225709 DEBUG nova.network.neutron [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing network info cache for port 35c98901-92ff-40ab-a9c4-0da34169949c _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.899 225709 DEBUG nova.virt.libvirt.vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.900 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.901 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.901 225709 DEBUG os_vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.902 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.902 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.903 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.907 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.907 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap35c98901-92, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.908 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap35c98901-92, col_values=(('external_ids', {'iface-id': '35c98901-92ff-40ab-a9c4-0da34169949c', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:4c:1e:6d', 'vm-uuid': 'ed3c80d1-b549-49d1-be66-00467e195256'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.909 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 NetworkManager[48978]: <info>  [1769163544.9102] manager: (tap35c98901-92): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.913 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.920 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.921 225709 INFO os_vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92')
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.922 225709 DEBUG nova.virt.libvirt.vif [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.923 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.924 225709 DEBUG nova.network.os_vif_util [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.928 225709 DEBUG nova.virt.libvirt.guest [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] attach device xml: <interface type="ethernet">
Jan 23 10:19:04 compute-1 nova_compute[225705]:   <mac address="fa:16:3e:4c:1e:6d"/>
Jan 23 10:19:04 compute-1 nova_compute[225705]:   <model type="virtio"/>
Jan 23 10:19:04 compute-1 nova_compute[225705]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:19:04 compute-1 nova_compute[225705]:   <mtu size="1442"/>
Jan 23 10:19:04 compute-1 nova_compute[225705]:   <target dev="tap35c98901-92"/>
Jan 23 10:19:04 compute-1 nova_compute[225705]: </interface>
Jan 23 10:19:04 compute-1 nova_compute[225705]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 10:19:04 compute-1 kernel: tap35c98901-92: entered promiscuous mode
Jan 23 10:19:04 compute-1 NetworkManager[48978]: <info>  [1769163544.9442] manager: (tap35c98901-92): new Tun device (/org/freedesktop/NetworkManager/Devices/32)
Jan 23 10:19:04 compute-1 ovn_controller[133293]: 2026-01-23T10:19:04Z|00034|binding|INFO|Claiming lport 35c98901-92ff-40ab-a9c4-0da34169949c for this chassis.
Jan 23 10:19:04 compute-1 ovn_controller[133293]: 2026-01-23T10:19:04Z|00035|binding|INFO|35c98901-92ff-40ab-a9c4-0da34169949c: Claiming fa:16:3e:4c:1e:6d 10.100.0.26
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.945 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.961 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:1e:6d 10.100.0.26'], port_security=['fa:16:3e:4c:1e:6d 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b2d21e8-4b70-4725-bde5-4813c876e6bd, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=35c98901-92ff-40ab-a9c4-0da34169949c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.964 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 35c98901-92ff-40ab-a9c4-0da34169949c in datapath 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a bound to our chassis
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.966 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a
Jan 23 10:19:04 compute-1 systemd-udevd[230139]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.985 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[cd5d1aab-1ab5-4069-906e-cd1b1e402ac2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.986 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5c9ea62d-41 in ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.988 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.988 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5c9ea62d-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.989 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[87d4efa6-a638-44bf-b4d6-279470ea7838]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:04 compute-1 ovn_controller[133293]: 2026-01-23T10:19:04Z|00036|binding|INFO|Setting lport 35c98901-92ff-40ab-a9c4-0da34169949c ovn-installed in OVS
Jan 23 10:19:04 compute-1 ovn_controller[133293]: 2026-01-23T10:19:04Z|00037|binding|INFO|Setting lport 35c98901-92ff-40ab-a9c4-0da34169949c up in Southbound
Jan 23 10:19:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:04.990 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d2a4e7bd-e8cd-4dd4-84f8-c7ba528b0844]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:04 compute-1 nova_compute[225705]: 2026-01-23 10:19:04.991 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:04 compute-1 NetworkManager[48978]: <info>  [1769163544.9994] device (tap35c98901-92): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:19:04 compute-1 NetworkManager[48978]: <info>  [1769163544.9999] device (tap35c98901-92): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.018 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[8a876276-2179-4b3b-9cba-07303de0f6f7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.041 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[83146e2c-bcfe-4615-bed5-57f9940624c9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.050 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.051 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.051 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:42:a1:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.051 225709 DEBUG nova.virt.libvirt.driver [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:4c:1e:6d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.067 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[1e3c3211-2293-4025-a7ca-fe04e78ddbea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 NetworkManager[48978]: <info>  [1769163545.0728] manager: (tap5c9ea62d-40): new Veth device (/org/freedesktop/NetworkManager/Devices/33)
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.072 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[539a0be0-e3d0-415f-b82e-234047e76374]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.084 225709 DEBUG nova.virt.libvirt.guest [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:05 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:19:05</nova:creationTime>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:19:05 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     <nova:port uuid="35c98901-92ff-40ab-a9c4-0da34169949c">
Jan 23 10:19:05 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 23 10:19:05 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:05 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:19:05 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:19:05 compute-1 nova_compute[225705]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.107 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[d6cf3322-0615-4ff4-be9e-8dd565510322]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.112 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[77721f7d-3e1a-4d62-974d-c43e9cda8e3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.119 225709 DEBUG oslo_concurrency.lockutils [None req-2bc9dceb-36b8-499b-973b-30d8b2c2e774 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.901s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:05 compute-1 NetworkManager[48978]: <info>  [1769163545.1388] device (tap5c9ea62d-40): carrier: link connected
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.147 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c25bba8e-36c2-45c7-8b45-95ca39e446e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004200 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.165 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0f1e6161-9b18-49ba-a525-198c765f8d65]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c9ea62d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:ca:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469912, 'reachable_time': 39567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230166, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.183 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[36412169-3cdb-46b0-93c2-de772429993e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe92:caf4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 469912, 'tstamp': 469912}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230167, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.201 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[be23fea0-529d-4ebe-a181-54b0d623cb6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5c9ea62d-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:92:ca:f4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469912, 'reachable_time': 39567, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230168, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.232 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[94e12952-adca-4dc7-8957-344cddc7c9ed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.288 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[a018731a-97eb-4719-8776-c464890ee3f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.289 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c9ea62d-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.289 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.289 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5c9ea62d-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.291 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:05 compute-1 NetworkManager[48978]: <info>  [1769163545.2918] manager: (tap5c9ea62d-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Jan 23 10:19:05 compute-1 kernel: tap5c9ea62d-40: entered promiscuous mode
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.295 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5c9ea62d-40, col_values=(('external_ids', {'iface-id': '179215ec-6510-4ebf-a6e5-fe4278583ce3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.296 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:05 compute-1 ovn_controller[133293]: 2026-01-23T10:19:05Z|00038|binding|INFO|Releasing lport 179215ec-6510-4ebf-a6e5-fe4278583ce3 from this chassis (sb_readonly=0)
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.297 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.297 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.298 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[dd87451e-f19f-4b0e-9b69-6ad632e8cab5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.299 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: global
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     log         /dev/log local0 debug
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     log-tag     haproxy-metadata-proxy-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     user        root
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     group       root
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     maxconn     1024
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     pidfile     /var/lib/neutron/external/pids/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.pid.haproxy
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     daemon
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: defaults
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     log global
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     mode http
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     option httplog
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     option dontlognull
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     option http-server-close
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     option forwardfor
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     retries                 3
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     timeout http-request    30s
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     timeout connect         30s
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     timeout client          32s
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     timeout server          32s
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     timeout http-keep-alive 30s
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: listen listener
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     bind 169.254.169.254:80
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:     http-request add-header X-OVN-Network-ID 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:19:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:05.299 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'env', 'PROCESS_TAG=haproxy-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5c9ea62d-4d78-4e2a-9702-db61ccfdb58a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.308 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.496 225709 DEBUG nova.compute.manager [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.497 225709 DEBUG oslo_concurrency.lockutils [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.497 225709 DEBUG oslo_concurrency.lockutils [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.498 225709 DEBUG oslo_concurrency.lockutils [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.498 225709 DEBUG nova.compute.manager [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.498 225709 WARNING nova.compute.manager [req-d24d927b-8f9d-4308-a12f-7edd7b2f0cda req-7edf3ddb-1843-4b25-b42c-f083a97e9e80 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.
Jan 23 10:19:05 compute-1 podman[230201]: 2026-01-23 10:19:05.667629523 +0000 UTC m=+0.047079020 container create c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:19:05 compute-1 ceph-mon[80126]: pgmap v774: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 3.0 KiB/s rd, 15 KiB/s wr, 4 op/s
Jan 23 10:19:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4045296728' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:05 compute-1 systemd[1]: Started libpod-conmon-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad.scope.
Jan 23 10:19:05 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:19:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/89c1e4de03bd97a8d8d560a5b0fc97bed6d4cbd47a0d6d1dbe06563b1dadaf91/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:19:05 compute-1 podman[230201]: 2026-01-23 10:19:05.642152628 +0000 UTC m=+0.021602145 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:19:05 compute-1 podman[230201]: 2026-01-23 10:19:05.750941246 +0000 UTC m=+0.130390743 container init c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:19:05 compute-1 podman[230201]: 2026-01-23 10:19:05.756037397 +0000 UTC m=+0.135486894 container start c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 10:19:05 compute-1 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : New worker (230222) forked
Jan 23 10:19:05 compute-1 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : Loading success.
Jan 23 10:19:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:19:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:05.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:05.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.898 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.899 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.899 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.900 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:19:05 compute-1 nova_compute[225705]: 2026-01-23 10:19:05.900 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.314 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:06 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:19:06 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1043744357' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.370 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.470s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.408 225709 DEBUG nova.network.neutron [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated VIF entry in instance network info cache for port 35c98901-92ff-40ab-a9c4-0da34169949c. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.409 225709 DEBUG nova.network.neutron [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.437 225709 DEBUG oslo_concurrency.lockutils [req-54287c3e-ae4a-4d1a-a219-f3f15e785ab3 req-afffa04c-8af6-445a-99d1-4d95000b39eb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.455 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.456 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000003 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:19:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.636 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.637 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4719MB free_disk=59.942726135253906GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.637 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.638 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:06 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1043744357' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.718 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance ed3c80d1-b549-49d1-be66-00467e195256 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.718 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.719 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.753 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.813 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-35c98901-92ff-40ab-a9c4-0da34169949c" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.814 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-35c98901-92ff-40ab-a9c4-0da34169949c" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.834 225709 DEBUG nova.objects.instance [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.859 225709 DEBUG nova.virt.libvirt.vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.860 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.861 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.866 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.870 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.872 225709 DEBUG nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Attempting to detach device tap35c98901-92 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.873 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <mac address="fa:16:3e:4c:1e:6d"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <model type="virtio"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <mtu size="1442"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <target dev="tap35c98901-92"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]: </interface>
Jan 23 10:19:06 compute-1 nova_compute[225705]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.883 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.889 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <name>instance-00000003</name>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:19:05</nova:creationTime>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:port uuid="35c98901-92ff-40ab-a9c4-0da34169949c">
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:19:06 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <system>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </system>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <os>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </os>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <features>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </features>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target dev='tape056b1c4-d8'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:4c:1e:6d'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target dev='tap35c98901-92'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='net1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </target>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </console>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <video>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </video>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:06 compute-1 nova_compute[225705]: </domain>
Jan 23 10:19:06 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.900 225709 INFO nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tap35c98901-92 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the persistent domain config.
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.901 225709 DEBUG nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] (1/8): Attempting to detach device tap35c98901-92 with device alias net1 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.902 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <mac address="fa:16:3e:4c:1e:6d"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <model type="virtio"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <mtu size="1442"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <target dev="tap35c98901-92"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]: </interface>
Jan 23 10:19:06 compute-1 nova_compute[225705]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 10:19:06 compute-1 kernel: tap35c98901-92 (unregistering): left promiscuous mode
Jan 23 10:19:06 compute-1 NetworkManager[48978]: <info>  [1769163546.9637] device (tap35c98901-92): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:19:06 compute-1 ovn_controller[133293]: 2026-01-23T10:19:06Z|00039|binding|INFO|Releasing lport 35c98901-92ff-40ab-a9c4-0da34169949c from this chassis (sb_readonly=0)
Jan 23 10:19:06 compute-1 ovn_controller[133293]: 2026-01-23T10:19:06Z|00040|binding|INFO|Setting lport 35c98901-92ff-40ab-a9c4-0da34169949c down in Southbound
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.968 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:06 compute-1 ovn_controller[133293]: 2026-01-23T10:19:06Z|00041|binding|INFO|Removing iface tap35c98901-92 ovn-installed in OVS
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.979 225709 DEBUG nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Received event <DeviceRemovedEvent: 1769163546.9786458, ed3c80d1-b549-49d1-be66-00467e195256 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 10:19:06 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.978 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:4c:1e:6d 10.100.0.26'], port_security=['fa:16:3e:4c:1e:6d 10.100.0.26'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.26/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b2d21e8-4b70-4725-bde5-4813c876e6bd, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=35c98901-92ff-40ab-a9c4-0da34169949c) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:19:06 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.979 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 35c98901-92ff-40ab-a9c4-0da34169949c in datapath 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a unbound from our chassis
Jan 23 10:19:06 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.981 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.983 225709 DEBUG nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Start waiting for the detach event from libvirt for device tap35c98901-92 with device alias net1 for instance ed3c80d1-b549-49d1-be66-00467e195256 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 10:19:06 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.982 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d53214b7-6d5a-43e1-9a9c-6eff3a496ccb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:06 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:06.982 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a namespace which is not needed anymore
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.984 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:19:06 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.990 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <name>instance-00000003</name>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:19:05</nova:creationTime>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <nova:port uuid="35c98901-92ff-40ab-a9c4-0da34169949c">
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.26" ipVersion="4"/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:19:06 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <system>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </system>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <os>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </os>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <features>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </features>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:19:06 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:06 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:19:06 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <target dev='tape056b1c4-d8'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       </target>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </console>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <video>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </video>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:07 compute-1 nova_compute[225705]: </domain>
Jan 23 10:19:07 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:06.998 225709 INFO nova.virt.libvirt.driver [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tap35c98901-92 from instance ed3c80d1-b549-49d1-be66-00467e195256 from the live domain config.
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.000 225709 DEBUG nova.virt.libvirt.vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.001 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.002 225709 DEBUG nova.network.os_vif_util [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.003 225709 DEBUG os_vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.009 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.010 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35c98901-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.012 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.014 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.017 225709 INFO os_vif [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92')
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.018 225709 DEBUG nova.virt.libvirt.guest [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:19:07</nova:creationTime>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:19:07 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:19:07 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:07 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:19:07 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:19:07 compute-1 nova_compute[225705]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 10:19:07 compute-1 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : haproxy version is 2.8.14-c23fe91
Jan 23 10:19:07 compute-1 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [NOTICE]   (230220) : path to executable is /usr/sbin/haproxy
Jan 23 10:19:07 compute-1 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [ALERT]    (230220) : Current worker (230222) exited with code 143 (Terminated)
Jan 23 10:19:07 compute-1 neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a[230216]: [WARNING]  (230220) : All workers exited. Exiting... (0)
Jan 23 10:19:07 compute-1 systemd[1]: libpod-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad.scope: Deactivated successfully.
Jan 23 10:19:07 compute-1 podman[230296]: 2026-01-23 10:19:07.127069815 +0000 UTC m=+0.058045815 container died c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:19:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:07 compute-1 systemd[1]: var-lib-containers-storage-overlay-89c1e4de03bd97a8d8d560a5b0fc97bed6d4cbd47a0d6d1dbe06563b1dadaf91-merged.mount: Deactivated successfully.
Jan 23 10:19:07 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad-userdata-shm.mount: Deactivated successfully.
Jan 23 10:19:07 compute-1 podman[230296]: 2026-01-23 10:19:07.174608298 +0000 UTC m=+0.105584278 container cleanup c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:19:07 compute-1 systemd[1]: libpod-conmon-c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad.scope: Deactivated successfully.
Jan 23 10:19:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:19:07 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1377108216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.244 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:07 compute-1 podman[230328]: 2026-01-23 10:19:07.244868929 +0000 UTC m=+0.048757021 container remove c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.249 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.251 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3aa18e58-2b74-42de-842c-fc888ae4d311]: (4, ('Fri Jan 23 10:19:07 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a (c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad)\nc3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad\nFri Jan 23 10:19:07 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a (c3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad)\nc3fd019c8886966037dc6144343ff722b8725702a918eddd1a48f0f7883747ad\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.252 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[11b42957-3506-467b-8405-1422f38f6c52]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.253 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5c9ea62d-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.255 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:07 compute-1 kernel: tap5c9ea62d-40: left promiscuous mode
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.269 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.273 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[57dbed41-dd22-4ef8-998c-fc7ebed18c83]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.284 225709 ERROR nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [req-22f68184-d6e8-4b1f-a131-e6c2a286d387] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID b22b6ed5-7bca-42dc-9b99-6f2ad6853af7.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-22f68184-d6e8-4b1f-a131-e6c2a286d387"}]}
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.290 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2b2e7789-9964-4558-8ef6-ad2e9005d5c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.292 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[26166fd5-328b-401d-958b-e43f782be4e5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.301 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.309 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8dd638d4-299a-493d-830c-fb971964a847]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 469904, 'reachable_time': 36842, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230345, 'error': None, 'target': 'ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:07 compute-1 systemd[1]: run-netns-ovnmeta\x2d5c9ea62d\x2d4d78\x2d4e2a\x2d9702\x2ddb61ccfdb58a.mount: Deactivated successfully.
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.321 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.321 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 0, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.321 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5c9ea62d-4d78-4e2a-9702-db61ccfdb58a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:19:07 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:07.322 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[50774e63-3ccb-4198-8cc1-03480a622fda]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.334 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.369 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.415 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.658 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.659 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.659 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.659 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.660 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.660 225709 WARNING nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.660 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-unplugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.661 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.661 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.661 225709 DEBUG oslo_concurrency.lockutils [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.662 225709 DEBUG nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-unplugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.662 225709 WARNING nova.compute.manager [req-a32aa6a4-2e64-423e-a1b4-ddf88dcdfb6f req-479017b3-23a2-4b33-b465-bd85f4194fb7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-unplugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.
Jan 23 10:19:07 compute-1 ceph-mon[80126]: pgmap v775: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 15 KiB/s rd, 16 KiB/s wr, 5 op/s
Jan 23 10:19:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1377108216' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:07.839 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:07.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:19:07 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3477460864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.926 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:07 compute-1 nova_compute[225705]: 2026-01-23 10:19:07.932 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:19:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.277 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updated inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 59, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.278 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.278 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.302 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.302 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.558 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.559 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:08 compute-1 nova_compute[225705]: 2026-01-23 10:19:08.559 225709 DEBUG nova.network.neutron [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:19:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3477460864' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/101909 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.297 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.330 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.814 225709 DEBUG nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.815 225709 DEBUG oslo_concurrency.lockutils [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.815 225709 DEBUG oslo_concurrency.lockutils [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.816 225709 DEBUG oslo_concurrency.lockutils [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.816 225709 DEBUG nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.817 225709 WARNING nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-35c98901-92ff-40ab-a9c4-0da34169949c for instance with vm_state active and task_state None.
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.817 225709 DEBUG nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-deleted-35c98901-92ff-40ab-a9c4-0da34169949c external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.818 225709 INFO nova.compute.manager [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Neutron deleted interface 35c98901-92ff-40ab-a9c4-0da34169949c; detaching it from the instance and deleting it from the info cache
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.818 225709 DEBUG nova.network.neutron [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:09.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.846 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:09 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:09.847 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:19:09 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:09.849 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.852 225709 DEBUG nova.objects.instance [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'system_metadata' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:19:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:09.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.906 225709 DEBUG nova.objects.instance [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'flavor' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:19:09 compute-1 ceph-mon[80126]: pgmap v776: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 3.9 KiB/s wr, 4 op/s
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.941 225709 DEBUG nova.virt.libvirt.vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.941 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.942 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.946 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.949 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <name>instance-00000003</name>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:19:07</nova:creationTime>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:19:09 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <system>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </system>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <os>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </os>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <features>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </features>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target dev='tape056b1c4-d8'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </target>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </console>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <video>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </video>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]: </domain>
Jan 23 10:19:09 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.950 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.953 225709 INFO nova.network.neutron [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Port 35c98901-92ff-40ab-a9c4-0da34169949c from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.954 225709 DEBUG nova.network.neutron [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.960 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:4c:1e:6d"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tap35c98901-92"/></interface>not found in domain: <domain type='kvm' id='1'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <name>instance-00000003</name>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <uuid>ed3c80d1-b549-49d1-be66-00467e195256</uuid>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:19:07</nova:creationTime>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:19:09 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <system>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='serial'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='uuid'>ed3c80d1-b549-49d1-be66-00467e195256</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </system>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <os>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </os>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <features>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </features>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk' index='2'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/ed3c80d1-b549-49d1-be66-00467e195256_disk.config' index='1'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:42:a1:b7'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target dev='tape056b1c4-d8'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       </target>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256/console.log' append='off'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </console>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <video>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </video>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c256,c378</label>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c256,c378</imagelabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:19:09 compute-1 nova_compute[225705]: </domain>
Jan 23 10:19:09 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.962 225709 WARNING nova.virt.libvirt.driver [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Detaching interface fa:16:3e:4c:1e:6d failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tap35c98901-92' not found.
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.963 225709 DEBUG nova.virt.libvirt.vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.963 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "35c98901-92ff-40ab-a9c4-0da34169949c", "address": "fa:16:3e:4c:1e:6d", "network": {"id": "5c9ea62d-4d78-4e2a-9702-db61ccfdb58a", "bridge": "br-int", "label": "tempest-network-smoke--1062451126", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.26", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap35c98901-92", "ovs_interfaceid": "35c98901-92ff-40ab-a9c4-0da34169949c", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.963 225709 DEBUG nova.network.os_vif_util [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.964 225709 DEBUG os_vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.966 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.966 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap35c98901-92, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.967 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.969 225709 INFO os_vif [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:4c:1e:6d,bridge_name='br-int',has_traffic_filtering=True,id=35c98901-92ff-40ab-a9c4-0da34169949c,network=Network(5c9ea62d-4d78-4e2a-9702-db61ccfdb58a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap35c98901-92')
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.970 225709 DEBUG nova.virt.libvirt.guest [req-85788626-79ff-490e-a5f7-85a30f3becc8 req-290a6aa0-ff8f-4682-82d1-0cbbfb68d628 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1946325722</nova:name>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:19:09</nova:creationTime>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     <nova:port uuid="e056b1c4-d8ee-40be-ab65-dad6851e9340">
Jan 23 10:19:09 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Jan 23 10:19:09 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:19:09 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:19:09 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:19:09 compute-1 nova_compute[225705]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.973 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:09 compute-1 nova_compute[225705]: 2026-01-23 10:19:09.990 225709 DEBUG oslo_concurrency.lockutils [None req-3094ce61-221e-46d5-a252-7cbc9122a695 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-ed3c80d1-b549-49d1-be66-00467e195256-35c98901-92ff-40ab-a9c4-0da34169949c" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 3.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:10 compute-1 ovn_controller[133293]: 2026-01-23T10:19:10Z|00042|binding|INFO|Releasing lport 572285ac-9ff4-42d8-9b72-b5588035f74c from this chassis (sb_readonly=0)
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.261 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04002e30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.791 225709 DEBUG nova.compute.manager [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.791 225709 DEBUG nova.compute.manager [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing instance network info cache due to event network-changed-e056b1c4-d8ee-40be-ab65-dad6851e9340. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.791 225709 DEBUG oslo_concurrency.lockutils [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.792 225709 DEBUG oslo_concurrency.lockutils [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.792 225709 DEBUG nova.network.neutron [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Refreshing network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.854 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.855 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.855 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.855 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.856 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.857 225709 INFO nova.compute.manager [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Terminating instance
Jan 23 10:19:10 compute-1 nova_compute[225705]: 2026-01-23 10:19:10.858 225709 DEBUG nova.compute.manager [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:19:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0003430 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:11 compute-1 nova_compute[225705]: 2026-01-23 10:19:11.317 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:11 compute-1 kernel: tape056b1c4-d8 (unregistering): left promiscuous mode
Jan 23 10:19:11 compute-1 NetworkManager[48978]: <info>  [1769163551.8087] device (tape056b1c4-d8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:19:11 compute-1 ovn_controller[133293]: 2026-01-23T10:19:11Z|00043|binding|INFO|Releasing lport e056b1c4-d8ee-40be-ab65-dad6851e9340 from this chassis (sb_readonly=0)
Jan 23 10:19:11 compute-1 ovn_controller[133293]: 2026-01-23T10:19:11Z|00044|binding|INFO|Setting lport e056b1c4-d8ee-40be-ab65-dad6851e9340 down in Southbound
Jan 23 10:19:11 compute-1 nova_compute[225705]: 2026-01-23 10:19:11.813 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:11 compute-1 ovn_controller[133293]: 2026-01-23T10:19:11Z|00045|binding|INFO|Removing iface tape056b1c4-d8 ovn-installed in OVS
Jan 23 10:19:11 compute-1 nova_compute[225705]: 2026-01-23 10:19:11.815 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.820 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:42:a1:b7 10.100.0.13'], port_security=['fa:16:3e:42:a1:b7 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': 'ed3c80d1-b549-49d1-be66-00467e195256', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '96259b98-6654-41f6-bfeb-290c4063344e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=93789b9e-064c-44b7-b00b-f52ca7e4569d, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=e056b1c4-d8ee-40be-ab65-dad6851e9340) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:19:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.821 143098 INFO neutron.agent.ovn.metadata.agent [-] Port e056b1c4-d8ee-40be-ab65-dad6851e9340 in datapath 4f467dc5-4a9f-42dc-990e-a2a671c8b09c unbound from our chassis
Jan 23 10:19:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.822 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4f467dc5-4a9f-42dc-990e-a2a671c8b09c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:19:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.823 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[628bb562-1f1f-44c5-893b-9ade97e3c9fe]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:11.824 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c namespace which is not needed anymore
Jan 23 10:19:11 compute-1 nova_compute[225705]: 2026-01-23 10:19:11.830 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:11.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:11 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Jan 23 10:19:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:11.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:11 compute-1 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 14.726s CPU time.
Jan 23 10:19:11 compute-1 systemd-machined[194551]: Machine qemu-1-instance-00000003 terminated.
Jan 23 10:19:11 compute-1 nova_compute[225705]: 2026-01-23 10:19:11.921 225709 DEBUG nova.network.neutron [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updated VIF entry in instance network info cache for port e056b1c4-d8ee-40be-ab65-dad6851e9340. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:19:11 compute-1 nova_compute[225705]: 2026-01-23 10:19:11.923 225709 DEBUG nova.network.neutron [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [{"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:11 compute-1 ceph-mon[80126]: pgmap v777: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 3.9 KiB/s wr, 4 op/s
Jan 23 10:19:11 compute-1 nova_compute[225705]: 2026-01-23 10:19:11.940 225709 DEBUG oslo_concurrency.lockutils [req-69565c02-54a7-4ecb-9ef9-a6ef2e309ca9 req-4a59d80f-5e69-4b98-8a7b-839c0edf1765 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ed3c80d1-b549-49d1-be66-00467e195256" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:11 compute-1 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : haproxy version is 2.8.14-c23fe91
Jan 23 10:19:11 compute-1 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [NOTICE]   (229998) : path to executable is /usr/sbin/haproxy
Jan 23 10:19:11 compute-1 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [WARNING]  (229998) : Exiting Master process...
Jan 23 10:19:11 compute-1 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [ALERT]    (229998) : Current worker (230000) exited with code 143 (Terminated)
Jan 23 10:19:11 compute-1 neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c[229994]: [WARNING]  (229998) : All workers exited. Exiting... (0)
Jan 23 10:19:11 compute-1 systemd[1]: libpod-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f.scope: Deactivated successfully.
Jan 23 10:19:11 compute-1 podman[230396]: 2026-01-23 10:19:11.972366187 +0000 UTC m=+0.059337667 container died d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.007 225709 DEBUG nova.compute.manager [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-unplugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.008 225709 DEBUG oslo_concurrency.lockutils [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.008 225709 DEBUG oslo_concurrency.lockutils [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.009 225709 DEBUG oslo_concurrency.lockutils [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.009 225709 DEBUG nova.compute.manager [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-unplugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.009 225709 DEBUG nova.compute.manager [req-60b51129-b437-48e6-ad15-143d11b967f9 req-7bcda56a-7c5f-463e-8963-cc83eec10d3f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-unplugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.012 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.083 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.091 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.100 225709 INFO nova.virt.libvirt.driver [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Instance destroyed successfully.
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.100 225709 DEBUG nova.objects.instance [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid ed3c80d1-b549-49d1-be66-00467e195256 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.116 225709 DEBUG nova.virt.libvirt.vif [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:18:21Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1946325722',display_name='tempest-TestNetworkBasicOps-server-1946325722',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1946325722',id=3,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNFuKmAQnfqG+BzzVInGJ8GglaB45/uthcMfVGPcRrvwRC2zblwcemdSLYtLD9+3hgjxhjYUbH8gOwi2FkpNlD5ZcM7M3wudxR6YBz9mthx4NRluv8FsfQLq6ZVWTLX5RA==',key_name='tempest-TestNetworkBasicOps-1418923051',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:18:31Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-i3vy6ive',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:18:31Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ed3c80d1-b549-49d1-be66-00467e195256,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.117 225709 DEBUG nova.network.os_vif_util [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "address": "fa:16:3e:42:a1:b7", "network": {"id": "4f467dc5-4a9f-42dc-990e-a2a671c8b09c", "bridge": "br-int", "label": "tempest-network-smoke--431220831", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.210", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape056b1c4-d8", "ovs_interfaceid": "e056b1c4-d8ee-40be-ab65-dad6851e9340", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.117 225709 DEBUG nova.network.os_vif_util [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.118 225709 DEBUG os_vif [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.119 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.120 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape056b1c4-d8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.126 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:19:12 compute-1 nova_compute[225705]: 2026-01-23 10:19:12.129 225709 INFO os_vif [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:42:a1:b7,bridge_name='br-int',has_traffic_filtering=True,id=e056b1c4-d8ee-40be-ab65-dad6851e9340,network=Network(4f467dc5-4a9f-42dc-990e-a2a671c8b09c),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape056b1c4-d8')
Jan 23 10:19:12 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f-userdata-shm.mount: Deactivated successfully.
Jan 23 10:19:12 compute-1 systemd[1]: var-lib-containers-storage-overlay-38a430a453066cd300215ffab9c681910b2ee216372ea5d2773756ffea2ac606-merged.mount: Deactivated successfully.
Jan 23 10:19:12 compute-1 podman[230396]: 2026-01-23 10:19:12.284209974 +0000 UTC m=+0.371181454 container cleanup d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 10:19:12 compute-1 systemd[1]: libpod-conmon-d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f.scope: Deactivated successfully.
Jan 23 10:19:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004240 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:13 compute-1 podman[230455]: 2026-01-23 10:19:13.159816972 +0000 UTC m=+0.849082270 container remove d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.166 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e54c0d2a-6dcb-4d34-90ef-4458c51dc2c0]: (4, ('Fri Jan 23 10:19:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c (d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f)\nd1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f\nFri Jan 23 10:19:12 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c (d1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f)\nd1f7ae2d3a582683218cb101226b6a008e249d57657f6d642dc41bdecc13c00f\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.168 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[05cedbd5-9b56-4cf2-b8df-0245ec391ea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.168 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4f467dc5-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:13 compute-1 nova_compute[225705]: 2026-01-23 10:19:13.171 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:13 compute-1 kernel: tap4f467dc5-40: left promiscuous mode
Jan 23 10:19:13 compute-1 nova_compute[225705]: 2026-01-23 10:19:13.185 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.188 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[106658e0-5b25-4f37-b8d1-e374b23af90e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.210 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c33bd82a-c3a3-45cc-ae4c-a9d6c6b9ea98]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.212 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0cc50864-fcd7-4e24-acc8-4861f08f7ec0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.233 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[37a2df42-f322-4271-b0c3-0c4a9b934490]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 466876, 'reachable_time': 19041, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230471, 'error': None, 'target': 'ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.237 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-4f467dc5-4a9f-42dc-990e-a2a671c8b09c deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:19:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:13.237 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6e87f3-67c0-413e-bebb-d832b0059f61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:13 compute-1 systemd[1]: run-netns-ovnmeta\x2d4f467dc5\x2d4a9f\x2d42dc\x2d990e\x2da2a671c8b09c.mount: Deactivated successfully.
Jan 23 10:19:13 compute-1 ceph-mon[80126]: pgmap v778: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 7.9 KiB/s wr, 5 op/s
Jan 23 10:19:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:13.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:13.900 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0003430 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.095 225709 DEBUG nova.compute.manager [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.095 225709 DEBUG oslo_concurrency.lockutils [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ed3c80d1-b549-49d1-be66-00467e195256-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.095 225709 DEBUG oslo_concurrency.lockutils [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.096 225709 DEBUG oslo_concurrency.lockutils [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.096 225709 DEBUG nova.compute.manager [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] No waiting events found dispatching network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.096 225709 WARNING nova.compute.manager [req-ef3c0812-f154-46ff-8218-bf7fb860d605 req-b5b2c26d-320a-425f-84f8-9dd24eeda7c0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received unexpected event network-vif-plugged-e056b1c4-d8ee-40be-ab65-dad6851e9340 for instance with vm_state active and task_state deleting.
Jan 23 10:19:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.521 225709 INFO nova.virt.libvirt.driver [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deleting instance files /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256_del
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.523 225709 INFO nova.virt.libvirt.driver [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deletion of /var/lib/nova/instances/ed3c80d1-b549-49d1-be66-00467e195256_del complete
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.573 225709 DEBUG nova.virt.libvirt.host [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.573 225709 INFO nova.virt.libvirt.host [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] UEFI support detected
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.576 225709 INFO nova.compute.manager [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 3.72 seconds to destroy the instance on the hypervisor.
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.577 225709 DEBUG oslo.service.loopingcall [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.577 225709 DEBUG nova.compute.manager [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:19:14 compute-1 nova_compute[225705]: 2026-01-23 10:19:14.577 225709 DEBUG nova.network.neutron [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:19:14 compute-1 podman[230474]: 2026-01-23 10:19:14.706792804 +0000 UTC m=+0.105830376 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 10:19:14 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:14.853 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004260 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:15 compute-1 ceph-mon[80126]: pgmap v779: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 5.1 KiB/s wr, 2 op/s
Jan 23 10:19:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:15.849 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:15.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:16 compute-1 nova_compute[225705]: 2026-01-23 10:19:16.319 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:16 compute-1 sudo[230495]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:19:16 compute-1 sudo[230495]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:16 compute-1 sudo[230495]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:16 compute-1 nova_compute[225705]: 2026-01-23 10:19:16.389 225709 DEBUG nova.network.neutron [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:16 compute-1 nova_compute[225705]: 2026-01-23 10:19:16.406 225709 INFO nova.compute.manager [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Took 1.83 seconds to deallocate network for instance.
Jan 23 10:19:16 compute-1 sudo[230520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:19:16 compute-1 sudo[230520]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:16 compute-1 nova_compute[225705]: 2026-01-23 10:19:16.463 225709 DEBUG nova.compute.manager [req-14d0e6d9-14cb-478c-89a0-bd899dc08df6 req-7923d7bd-ef43-4c21-8512-87132622ad97 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Received event network-vif-deleted-e056b1c4-d8ee-40be-ab65-dad6851e9340 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:16 compute-1 nova_compute[225705]: 2026-01-23 10:19:16.468 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:16 compute-1 nova_compute[225705]: 2026-01-23 10:19:16.468 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:16 compute-1 nova_compute[225705]: 2026-01-23 10:19:16.517 225709 DEBUG oslo_concurrency.processutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:16 compute-1 sudo[230520]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:17 compute-1 nova_compute[225705]: 2026-01-23 10:19:17.031 225709 DEBUG oslo_concurrency.processutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.514s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:17 compute-1 nova_compute[225705]: 2026-01-23 10:19:17.039 225709 DEBUG nova.compute.provider_tree [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:19:17 compute-1 nova_compute[225705]: 2026-01-23 10:19:17.064 225709 DEBUG nova.scheduler.client.report [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:19:17 compute-1 nova_compute[225705]: 2026-01-23 10:19:17.086 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:17 compute-1 nova_compute[225705]: 2026-01-23 10:19:17.117 225709 INFO nova.scheduler.client.report [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance ed3c80d1-b549-49d1-be66-00467e195256
Jan 23 10:19:17 compute-1 nova_compute[225705]: 2026-01-23 10:19:17.123 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:17 compute-1 nova_compute[225705]: 2026-01-23 10:19:17.201 225709 DEBUG oslo_concurrency.lockutils [None req-8c498e1f-c7fa-4b93-bb18-8b916c0a5b47 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ed3c80d1-b549-49d1-be66-00467e195256" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 6.347s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:17 compute-1 ceph-mon[80126]: pgmap v780: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 36 KiB/s rd, 6.2 KiB/s wr, 30 op/s
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/613545648' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:19:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:19:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:17.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:17.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004280 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:19 compute-1 ceph-mon[80126]: pgmap v781: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Jan 23 10:19:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:19.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:19.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf04003f30 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:21 compute-1 nova_compute[225705]: 2026-01-23 10:19:21.368 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:21.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:21.909 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:21 compute-1 ceph-mon[80126]: pgmap v782: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Jan 23 10:19:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:19:21 compute-1 sudo[230601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:19:21 compute-1 sudo[230601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:22 compute-1 sudo[230601]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_20] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:22 compute-1 nova_compute[225705]: 2026-01-23 10:19:22.125 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:22 compute-1 nova_compute[225705]: 2026-01-23 10:19:22.457 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:22 compute-1 nova_compute[225705]: 2026-01-23 10:19:22.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:22 compute-1 ceph-mon[80126]: pgmap v783: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 5.2 KiB/s wr, 28 op/s
Jan 23 10:19:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:23 compute-1 sudo[230630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:19:23 compute-1 sudo[230630]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:23 compute-1 sudo[230630]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:23.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:25 compute-1 ceph-mon[80126]: pgmap v784: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:19:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:25.860 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:25.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:26 compute-1 nova_compute[225705]: 2026-01-23 10:19:26.371 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:27 compute-1 nova_compute[225705]: 2026-01-23 10:19:27.099 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163552.0976357, ed3c80d1-b549-49d1-be66-00467e195256 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:19:27 compute-1 nova_compute[225705]: 2026-01-23 10:19:27.100 225709 INFO nova.compute.manager [-] [instance: ed3c80d1-b549-49d1-be66-00467e195256] VM Stopped (Lifecycle Event)
Jan 23 10:19:27 compute-1 nova_compute[225705]: 2026-01-23 10:19:27.125 225709 DEBUG nova.compute.manager [None req-83158962-eff8-4a10-ae7d-f24339bf8aec - - - - - -] [instance: ed3c80d1-b549-49d1-be66-00467e195256] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:19:27 compute-1 nova_compute[225705]: 2026-01-23 10:19:27.128 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:27 compute-1 ceph-mon[80126]: pgmap v785: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:19:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:27.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:27.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:29.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:30.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:30 compute-1 ceph-mon[80126]: pgmap v786: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec003a40 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:31 compute-1 nova_compute[225705]: 2026-01-23 10:19:31.373 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:31 compute-1 ceph-mon[80126]: pgmap v787: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:31.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:32 compute-1 nova_compute[225705]: 2026-01-23 10:19:32.131 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:32.419 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:32 compute-1 podman[230659]: 2026-01-23 10:19:32.715060698 +0000 UTC m=+0.106097065 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:19:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:33 compute-1 ceph-mon[80126]: pgmap v788: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:33.866 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:34.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:35 compute-1 ceph-mon[80126]: pgmap v789: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:35.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:36 compute-1 nova_compute[225705]: 2026-01-23 10:19:36.374 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:36.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:37 compute-1 nova_compute[225705]: 2026-01-23 10:19:37.134 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:37 compute-1 ceph-mon[80126]: pgmap v790: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:19:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:37.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:38.495 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.462 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.463 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.481 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.548 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.549 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.554 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.555 225709 INFO nova.compute.claims [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Claim successful on node compute-1.ctlplane.example.com
Jan 23 10:19:39 compute-1 nova_compute[225705]: 2026-01-23 10:19:39.661 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:39.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:19:40 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3401891283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.110 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.119 225709 DEBUG nova.compute.provider_tree [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.389 225709 DEBUG nova.scheduler.client.report [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.425 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.876s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.426 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:19:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.470 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.471 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.489 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:19:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:40.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.507 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:19:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.595 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.596 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.597 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Creating image(s)
Jan 23 10:19:40 compute-1 nova_compute[225705]: 2026-01-23 10:19:40.624 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:19:40 compute-1 ceph-mon[80126]: pgmap v791: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.018 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.052 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.057 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.080 225709 DEBUG nova.policy [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.117 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.118 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.119 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.120 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.148 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.152 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:41 compute-1 nova_compute[225705]: 2026-01-23 10:19:41.376 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:41.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:42 compute-1 nova_compute[225705]: 2026-01-23 10:19:42.129 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Successfully created port: 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:19:42 compute-1 nova_compute[225705]: 2026-01-23 10:19:42.136 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3401891283' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:19:42 compute-1 ceph-mon[80126]: pgmap v792: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:19:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:42.500 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:42 compute-1 nova_compute[225705]: 2026-01-23 10:19:42.707 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.554s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:42 compute-1 nova_compute[225705]: 2026-01-23 10:19:42.796 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.025 225709 DEBUG nova.objects.instance [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.045 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.046 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Ensure instance console log exists: /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.047 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.048 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.048 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:43 compute-1 ceph-mon[80126]: pgmap v793: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:19:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.568 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Successfully updated port: 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.596 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.596 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.596 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.713 225709 DEBUG nova.compute.manager [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.713 225709 DEBUG nova.compute.manager [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing instance network info cache due to event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:19:43 compute-1 nova_compute[225705]: 2026-01-23 10:19:43.713 225709 DEBUG oslo_concurrency.lockutils [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:43.878 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:43 compute-1 sudo[230881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:19:43 compute-1 sudo[230881]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:19:43 compute-1 sudo[230881]: pam_unix(sudo:session): session closed for user root
Jan 23 10:19:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:44 compute-1 nova_compute[225705]: 2026-01-23 10:19:44.478 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:19:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:44.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:45 compute-1 podman[230907]: 2026-01-23 10:19:45.650258754 +0000 UTC m=+0.056480007 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Jan 23 10:19:45 compute-1 ceph-mon[80126]: pgmap v794: 353 pgs: 353 active+clean; 41 MiB data, 254 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 0 op/s
Jan 23 10:19:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:45.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.378 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.499 225709 DEBUG nova.network.neutron [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:19:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:46.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.521 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.521 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance network_info: |[{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.522 225709 DEBUG oslo_concurrency.lockutils [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.522 225709 DEBUG nova.network.neutron [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.525 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start _get_guest_xml network_info=[{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.529 225709 WARNING nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.532 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.532 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:19:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.534 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.535 225709 DEBUG nova.virt.libvirt.host [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.535 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.535 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.536 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.537 225709 DEBUG nova.virt.hardware [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:19:46 compute-1 nova_compute[225705]: 2026-01-23 10:19:46.540 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:19:47 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2913217029' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.125 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.585s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.159 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.165 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.186 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:19:47 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3310066042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.622 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.623 225709 DEBUG nova.virt.libvirt.vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-263648847',display_name='tempest-TestNetworkBasicOps-server-263648847',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-263648847',id=4,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD66+yyfhusLfImf1uKQDIRs2V4/o1F8Eh/Z0SqxwU6ND9wcYC22x/WBjiGE3hyxp/MbA2OwCQVvqoeXerAOWd4tdGub7VFIxVXpqt6OghLL3nU7rH27QJ0mug8wzMNOTA==',key_name='tempest-TestNetworkBasicOps-501443179',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-rvp4p09d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:40Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.624 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.625 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.626 225709 DEBUG nova.objects.instance [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.763 225709 DEBUG nova.network.neutron [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated VIF entry in instance network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.763 225709 DEBUG nova.network.neutron [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.800 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <uuid>87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334</uuid>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <name>instance-00000004</name>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <memory>131072</memory>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <vcpu>1</vcpu>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <nova:name>tempest-TestNetworkBasicOps-server-263648847</nova:name>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <nova:creationTime>2026-01-23 10:19:46</nova:creationTime>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <nova:flavor name="m1.nano">
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:memory>128</nova:memory>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:disk>1</nova:disk>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:swap>0</nova:swap>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       </nova:flavor>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <nova:owner>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       </nova:owner>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <nova:ports>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <nova:port uuid="5def28f3-3bf5-4f1f-8e37-51794dbddfc6">
Jan 23 10:19:47 compute-1 nova_compute[225705]:           <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         </nova:port>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       </nova:ports>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </nova:instance>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <sysinfo type="smbios">
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <system>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <entry name="serial">87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334</entry>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <entry name="uuid">87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334</entry>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </system>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <os>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <boot dev="hd"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <smbios mode="sysinfo"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   </os>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <features>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <vmcoreinfo/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   </features>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <clock offset="utc">
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <timer name="hpet" present="no"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <cpu mode="host-model" match="exact">
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <disk type="network" device="disk">
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk">
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <target dev="vda" bus="virtio"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <disk type="network" device="cdrom">
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config">
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       </source>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:19:47 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <target dev="sda" bus="sata"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <interface type="ethernet">
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <mac address="fa:16:3e:d5:a8:9e"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <mtu size="1442"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <target dev="tap5def28f3-3b"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <serial type="pty">
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <log file="/var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/console.log" append="off"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <video>
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </video>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <input type="tablet" bus="usb"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <rng model="virtio">
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <controller type="usb" index="0"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     <memballoon model="virtio">
Jan 23 10:19:47 compute-1 nova_compute[225705]:       <stats period="10"/>
Jan 23 10:19:47 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:19:47 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:19:47 compute-1 nova_compute[225705]: </domain>
Jan 23 10:19:47 compute-1 nova_compute[225705]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.801 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Preparing to wait for external event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.801 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.801 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.802 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.802 225709 DEBUG nova.virt.libvirt.vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:19:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-263648847',display_name='tempest-TestNetworkBasicOps-server-263648847',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-263648847',id=4,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD66+yyfhusLfImf1uKQDIRs2V4/o1F8Eh/Z0SqxwU6ND9wcYC22x/WBjiGE3hyxp/MbA2OwCQVvqoeXerAOWd4tdGub7VFIxVXpqt6OghLL3nU7rH27QJ0mug8wzMNOTA==',key_name='tempest-TestNetworkBasicOps-501443179',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-rvp4p09d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:19:40Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.803 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.803 225709 DEBUG nova.network.os_vif_util [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.803 225709 DEBUG os_vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.804 225709 DEBUG oslo_concurrency.lockutils [req-1d5575b6-8dee-4769-be19-512e33f85ef0 req-bb83a0f9-20b7-4a2e-aabb-b37916242242 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.804 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.805 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.805 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.808 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.808 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5def28f3-3b, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.809 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap5def28f3-3b, col_values=(('external_ids', {'iface-id': '5def28f3-3bf5-4f1f-8e37-51794dbddfc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d5:a8:9e', 'vm-uuid': '87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.811 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:47 compute-1 NetworkManager[48978]: <info>  [1769163587.8126] manager: (tap5def28f3-3b): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.813 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.820 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:47 compute-1 nova_compute[225705]: 2026-01-23 10:19:47.821 225709 INFO os_vif [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b')
Jan 23 10:19:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:47.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.037 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.037 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.037 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:d5:a8:9e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.038 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Using config drive
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.116 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:19:48 compute-1 ceph-mon[80126]: pgmap v795: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:19:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2913217029' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:19:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3310066042' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:19:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:48.511 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.808 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Creating config drive at /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.813 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0x9frigc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.944 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp0x9frigc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.991 225709 DEBUG nova.storage.rbd_utils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:19:48 compute-1 nova_compute[225705]: 2026-01-23 10:19:48.996 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:19:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00042c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:49 compute-1 ceph-mon[80126]: pgmap v796: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:19:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2936068880' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:19:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2936068880' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.577 225709 DEBUG oslo_concurrency.processutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.579 225709 INFO nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deleting local config drive /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334/disk.config because it was imported into RBD.
Jan 23 10:19:49 compute-1 kernel: tap5def28f3-3b: entered promiscuous mode
Jan 23 10:19:49 compute-1 NetworkManager[48978]: <info>  [1769163589.6504] manager: (tap5def28f3-3b): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Jan 23 10:19:49 compute-1 ovn_controller[133293]: 2026-01-23T10:19:49Z|00046|binding|INFO|Claiming lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for this chassis.
Jan 23 10:19:49 compute-1 ovn_controller[133293]: 2026-01-23T10:19:49Z|00047|binding|INFO|5def28f3-3bf5-4f1f-8e37-51794dbddfc6: Claiming fa:16:3e:d5:a8:9e 10.100.0.8
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.650 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.654 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.669 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a8:9e 10.100.0.8'], port_security=['fa:16:3e:d5:a8:9e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f52d23-9898-43a0-a951-b69cb2abebab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '077cd29b-8d1e-4ab1-b762-8cd58191c522', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30316727-f942-4d99-94ec-26d1184b5c8a, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=5def28f3-3bf5-4f1f-8e37-51794dbddfc6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.671 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 in datapath 13f52d23-9898-43a0-a951-b69cb2abebab bound to our chassis
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.672 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 13f52d23-9898-43a0-a951-b69cb2abebab
Jan 23 10:19:49 compute-1 systemd-udevd[231064]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:19:49 compute-1 systemd-machined[194551]: New machine qemu-2-instance-00000004.
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.686 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e166fe2c-ea73-45ab-b57c-4e08a4a79c5f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.687 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap13f52d23-91 in ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.689 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap13f52d23-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.689 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[df06d815-f0ec-4985-b465-452af119deae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.690 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7e96bf8d-16b3-4817-bcdb-b2bddd2e6d9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 NetworkManager[48978]: <info>  [1769163589.6973] device (tap5def28f3-3b): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:19:49 compute-1 NetworkManager[48978]: <info>  [1769163589.6984] device (tap5def28f3-3b): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.701 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[b830199b-6428-4046-914b-0e767a984559]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:49 compute-1 systemd[1]: Started Virtual Machine qemu-2-instance-00000004.
Jan 23 10:19:49 compute-1 ovn_controller[133293]: 2026-01-23T10:19:49Z|00048|binding|INFO|Setting lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 ovn-installed in OVS
Jan 23 10:19:49 compute-1 ovn_controller[133293]: 2026-01-23T10:19:49Z|00049|binding|INFO|Setting lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 up in Southbound
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.724 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.728 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8e552358-4a9c-4d13-83b5-169617ff856a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.754 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[d6703ecd-e1e1-4f5d-a8fe-30f977622869]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 NetworkManager[48978]: <info>  [1769163589.7605] manager: (tap13f52d23-90): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.761 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[f36674fa-25eb-4411-9813-42ae764f2486]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.789 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[9242e68a-21c5-4d2b-a2f4-a6afd7d2c7c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.791 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b79957-5801-484c-bdbc-fbb3e5b3d1ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 NetworkManager[48978]: <info>  [1769163589.8158] device (tap13f52d23-90): carrier: link connected
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.823 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[63e79906-d8d9-4c1d-a4c8-e1eb1c481df7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.846 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ad5880ef-910d-4115-b647-ed225da114d1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f52d23-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:4e:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474379, 'reachable_time': 29178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231098, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.864 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0638fca8-0335-4dc9-a78a-532bcd7a96e7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:febe:4e18'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 474379, 'tstamp': 474379}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 231100, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:49.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.888 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8f9ec703-314d-42bb-8173-a826256e23dd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap13f52d23-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:be:4e:18'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474379, 'reachable_time': 29178, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 231101, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:49.929 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[99521177-1fff-467d-ab2b-8d8554453608]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.934 225709 DEBUG nova.compute.manager [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.934 225709 DEBUG oslo_concurrency.lockutils [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.934 225709 DEBUG oslo_concurrency.lockutils [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.935 225709 DEBUG oslo_concurrency.lockutils [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:49 compute-1 nova_compute[225705]: 2026-01-23 10:19:49.935 225709 DEBUG nova.compute.manager [req-060646d7-ff14-4b03-8801-f05c30ca40e3 req-68884326-48d9-4ecc-ab40-189205a16ef7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Processing event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.000 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[143ebb9f-deee-4e15-b00c-a0785c041a0d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.003 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f52d23-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.004 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.005 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap13f52d23-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.006 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:50 compute-1 NetworkManager[48978]: <info>  [1769163590.0074] manager: (tap13f52d23-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Jan 23 10:19:50 compute-1 kernel: tap13f52d23-90: entered promiscuous mode
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.013 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap13f52d23-90, col_values=(('external_ids', {'iface-id': 'bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.014 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:50 compute-1 ovn_controller[133293]: 2026-01-23T10:19:50Z|00050|binding|INFO|Releasing lport bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a from this chassis (sb_readonly=0)
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.016 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/13f52d23-9898-43a0-a951-b69cb2abebab.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/13f52d23-9898-43a0-a951-b69cb2abebab.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.016 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[a28ee66e-9576-48d8-b54b-de45bbcce73a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.017 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: global
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     log         /dev/log local0 debug
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     log-tag     haproxy-metadata-proxy-13f52d23-9898-43a0-a951-b69cb2abebab
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     user        root
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     group       root
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     maxconn     1024
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     pidfile     /var/lib/neutron/external/pids/13f52d23-9898-43a0-a951-b69cb2abebab.pid.haproxy
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     daemon
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: defaults
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     log global
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     mode http
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     option httplog
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     option dontlognull
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     option http-server-close
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     option forwardfor
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     retries                 3
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     timeout http-request    30s
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     timeout connect         30s
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     timeout client          32s
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     timeout server          32s
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     timeout http-keep-alive 30s
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: listen listener
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     bind 169.254.169.254:80
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:     http-request add-header X-OVN-Network-ID 13f52d23-9898-43a0-a951-b69cb2abebab
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:19:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:50.018 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'env', 'PROCESS_TAG=haproxy-13f52d23-9898-43a0-a951-b69cb2abebab', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/13f52d23-9898-43a0-a951-b69cb2abebab.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.027 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.248 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.249 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163590.2477126, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.250 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Started (Lifecycle Event)
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.252 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.255 225709 INFO nova.virt.libvirt.driver [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance spawned successfully.
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.256 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.284 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.290 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.291 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.292 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.292 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.293 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.294 225709 DEBUG nova.virt.libvirt.driver [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.298 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.342 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.343 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163590.2479613, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.343 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Paused (Lifecycle Event)
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.367 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.372 225709 INFO nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 9.78 seconds to spawn the instance on the hypervisor.
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.373 225709 DEBUG nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.374 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163590.2521753, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.374 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Resumed (Lifecycle Event)
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.402 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.406 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:19:50 compute-1 podman[231176]: 2026-01-23 10:19:50.416712822 +0000 UTC m=+0.067575767 container create 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.430 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:19:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec004360 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.444 225709 INFO nova.compute.manager [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 10.92 seconds to build instance.
Jan 23 10:19:50 compute-1 nova_compute[225705]: 2026-01-23 10:19:50.461 225709 DEBUG oslo_concurrency.lockutils [None req-297111fd-bf3b-489c-8b0f-9a15ecb5b56d f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.998s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:50 compute-1 podman[231176]: 2026-01-23 10:19:50.374663793 +0000 UTC m=+0.025526788 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:19:50 compute-1 systemd[1]: Started libpod-conmon-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e.scope.
Jan 23 10:19:50 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:19:50 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/200b86ca9cf8069eb92d9df0e5692d0577d96f4ecb60bc72d1a62929a70d50cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:19:50 compute-1 podman[231176]: 2026-01-23 10:19:50.510079003 +0000 UTC m=+0.160941968 container init 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Jan 23 10:19:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:50.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:50 compute-1 podman[231176]: 2026-01-23 10:19:50.516049682 +0000 UTC m=+0.166912627 container start 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 10:19:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:19:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_17] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:50 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : New worker (231198) forked
Jan 23 10:19:50 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : Loading success.
Jan 23 10:19:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:51 compute-1 nova_compute[225705]: 2026-01-23 10:19:51.420 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:51 compute-1 ceph-mon[80126]: pgmap v797: 353 pgs: 353 active+clean; 88 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:19:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:19:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1804.3 total, 600.0 interval
                                           Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 10K writes, 2684 syncs, 4.00 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1689 writes, 4700 keys, 1689 commit groups, 1.0 writes per commit group, ingest: 4.62 MB, 0.01 MB/s
                                           Interval WAL: 1689 writes, 725 syncs, 2.33 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:19:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:51.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.023 225709 DEBUG nova.compute.manager [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.024 225709 DEBUG oslo_concurrency.lockutils [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.024 225709 DEBUG oslo_concurrency.lockutils [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.025 225709 DEBUG oslo_concurrency.lockutils [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.025 225709 DEBUG nova.compute.manager [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] No waiting events found dispatching network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.025 225709 WARNING nova.compute.manager [req-ea34aebc-fc97-4245-9ebc-e2b3f2828185 req-0049e55b-61e3-4817-b9ae-896bc2c6b2e0 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received unexpected event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for instance with vm_state active and task_state None.
Jan 23 10:19:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:52.516 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:52 compute-1 ovn_controller[133293]: 2026-01-23T10:19:52Z|00051|binding|INFO|Releasing lport bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a from this chassis (sb_readonly=0)
Jan 23 10:19:52 compute-1 NetworkManager[48978]: <info>  [1769163592.6714] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.670 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:52 compute-1 NetworkManager[48978]: <info>  [1769163592.6724] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Jan 23 10:19:52 compute-1 ovn_controller[133293]: 2026-01-23T10:19:52Z|00052|binding|INFO|Releasing lport bc964dd7-f70f-4a1c-80bd-e1f6bfbe809a from this chassis (sb_readonly=0)
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.707 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.713 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:52 compute-1 nova_compute[225705]: 2026-01-23 10:19:52.811 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:53 compute-1 nova_compute[225705]: 2026-01-23 10:19:53.112 225709 DEBUG nova.compute.manager [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:19:53 compute-1 nova_compute[225705]: 2026-01-23 10:19:53.113 225709 DEBUG nova.compute.manager [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing instance network info cache due to event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:19:53 compute-1 nova_compute[225705]: 2026-01-23 10:19:53.113 225709 DEBUG oslo_concurrency.lockutils [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:19:53 compute-1 nova_compute[225705]: 2026-01-23 10:19:53.114 225709 DEBUG oslo_concurrency.lockutils [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:19:53 compute-1 nova_compute[225705]: 2026-01-23 10:19:53.114 225709 DEBUG nova.network.neutron [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:19:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004480 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:53 compute-1 ceph-mon[80126]: pgmap v798: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 79 op/s
Jan 23 10:19:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:53.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:54.519 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:19:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:19:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:19:55.049 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:19:55 compute-1 ceph-mon[80126]: pgmap v799: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.3 MiB/s rd, 1.8 MiB/s wr, 79 op/s
Jan 23 10:19:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002ee0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:55 compute-1 nova_compute[225705]: 2026-01-23 10:19:55.648 225709 DEBUG nova.network.neutron [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated VIF entry in instance network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:19:55 compute-1 nova_compute[225705]: 2026-01-23 10:19:55.649 225709 DEBUG nova.network.neutron [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:19:55 compute-1 nova_compute[225705]: 2026-01-23 10:19:55.721 225709 DEBUG oslo_concurrency.lockutils [req-b9dfdb04-a5d0-4881-b4de-60782bc0d6d3 req-aa9424e6-d603-4468-b156-d74d4314bbac 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:19:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:55.891 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:56 compute-1 nova_compute[225705]: 2026-01-23 10:19:56.422 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00044a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:56.523 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:57 compute-1 nova_compute[225705]: 2026-01-23 10:19:57.815 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:19:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:57.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:19:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002ee0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:19:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:19:58.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:19:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00044c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:19:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:19:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:19:59 compute-1 nova_compute[225705]: 2026-01-23 10:19:59.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:19:59 compute-1 nova_compute[225705]: 2026-01-23 10:19:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:19:59 compute-1 nova_compute[225705]: 2026-01-23 10:19:59.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:19:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:19:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:19:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:19:59.897 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:00 compute-1 nova_compute[225705]: 2026-01-23 10:20:00.398 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:20:00 compute-1 nova_compute[225705]: 2026-01-23 10:20:00.399 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:20:00 compute-1 nova_compute[225705]: 2026-01-23 10:20:00.399 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 10:20:00 compute-1 nova_compute[225705]: 2026-01-23 10:20:00.400 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:20:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:00.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002ee0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00044e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:01 compute-1 nova_compute[225705]: 2026-01-23 10:20:01.424 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:01 compute-1 ceph-mds[84630]: mds.beacon.cephfs.compute-1.bcvzvj missed beacon ack from the monitors
Jan 23 10:20:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:01.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:02 compute-1 nova_compute[225705]: 2026-01-23 10:20:02.489 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.174", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:20:02 compute-1 nova_compute[225705]: 2026-01-23 10:20:02.513 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:20:02 compute-1 nova_compute[225705]: 2026-01-23 10:20:02.513 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 10:20:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:02.534 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:02 compute-1 nova_compute[225705]: 2026-01-23 10:20:02.818 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:02 compute-1 nova_compute[225705]: 2026-01-23 10:20:02.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:02 compute-1 nova_compute[225705]: 2026-01-23 10:20:02.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:03 compute-1 podman[231215]: 2026-01-23 10:20:03.681965509 +0000 UTC m=+0.085973678 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 10:20:03 compute-1 nova_compute[225705]: 2026-01-23 10:20:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:03.901 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:04 compute-1 sudo[231242]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:20:04 compute-1 sudo[231242]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:04 compute-1 sudo[231242]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_23] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004500 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:04.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:04 compute-1 nova_compute[225705]: 2026-01-23 10:20:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:04 compute-1 nova_compute[225705]: 2026-01-23 10:20:04.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:20:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).paxos(paxos updating c 2009..2632) lease_timeout -- calling new election
Jan 23 10:20:05 compute-1 ceph-mon[80126]: log_channel(cluster) log [INF] : mon.compute-1 calling monitor election
Jan 23 10:20:05 compute-1 ceph-mon[80126]: paxos.2).electionLogic(14) init, last seen epoch 14
Jan 23 10:20:05 compute-1 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 10:20:05 compute-1 ceph-mon[80126]: mon.compute-1@2(electing) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 10:20:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:05 compute-1 ceph-mds[84630]: mds.beacon.cephfs.compute-1.bcvzvj missed beacon ack from the monitors
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.896 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.896 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.897 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:20:05 compute-1 nova_compute[225705]: 2026-01-23 10:20:05.897 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:20:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:05.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:06 compute-1 ceph-mon[80126]: mon.compute-1@2(electing) e3 handle_auth_request failed to assign global_id
Jan 23 10:20:06 compute-1 ceph-mon[80126]: mon.compute-1@2(electing) e3 handle_auth_request failed to assign global_id
Jan 23 10:20:06 compute-1 sshd-session[231270]: Invalid user sol from 45.148.10.240 port 45568
Jan 23 10:20:06 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Jan 23 10:20:06 compute-1 sshd-session[231270]: Connection closed by invalid user sol 45.148.10.240 port 45568 [preauth]
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.426 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:06.540 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.594 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.696s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.673 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.673 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000004 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.858 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.860 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4682MB free_disk=59.96738052368164GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.860 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.860 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.941 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.942 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.942 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:20:06 compute-1 nova_compute[225705]: 2026-01-23 10:20:06.987 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:20:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:20:07 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1764227055' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:07 compute-1 nova_compute[225705]: 2026-01-23 10:20:07.437 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:20:07 compute-1 nova_compute[225705]: 2026-01-23 10:20:07.444 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:20:07 compute-1 nova_compute[225705]: 2026-01-23 10:20:07.470 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:20:07 compute-1 nova_compute[225705]: 2026-01-23 10:20:07.496 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:20:07 compute-1 nova_compute[225705]: 2026-01-23 10:20:07.497 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.637s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:20:07 compute-1 nova_compute[225705]: 2026-01-23 10:20:07.820 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:07.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:08.542 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004790 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:08 compute-1 ceph-mon[80126]: pgmap v801: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Jan 23 10:20:08 compute-1 ceph-mon[80126]: overall HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:20:08 compute-1 ceph-mon[80126]: pgmap v802: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Jan 23 10:20:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1668936281' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2233825662' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-1 ceph-mon[80126]: pgmap v803: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 81 op/s
Jan 23 10:20:08 compute-1 ceph-mon[80126]: pgmap v804: 353 pgs: 353 active+clean; 88 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 709 KiB/s rd, 28 op/s
Jan 23 10:20:08 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-1 ceph-mon[80126]: mon.compute-1 calling monitor election
Jan 23 10:20:08 compute-1 ceph-mon[80126]: mon.compute-0 calling monitor election
Jan 23 10:20:08 compute-1 ceph-mon[80126]: mon.compute-0 is new leader, mons compute-0,compute-2,compute-1 in quorum (ranks 0,1,2)
Jan 23 10:20:08 compute-1 ceph-mon[80126]: monmap epoch 3
Jan 23 10:20:08 compute-1 ceph-mon[80126]: fsid f3005f84-239a-55b6-a948-8f1fb592b920
Jan 23 10:20:08 compute-1 ceph-mon[80126]: last_changed 2026-01-23T09:50:47.540109+0000
Jan 23 10:20:08 compute-1 ceph-mon[80126]: created 2026-01-23T09:47:35.499222+0000
Jan 23 10:20:08 compute-1 ceph-mon[80126]: min_mon_release 19 (squid)
Jan 23 10:20:08 compute-1 ceph-mon[80126]: election_strategy: 1
Jan 23 10:20:08 compute-1 ceph-mon[80126]: 0: [v2:192.168.122.100:3300/0,v1:192.168.122.100:6789/0] mon.compute-0
Jan 23 10:20:08 compute-1 ceph-mon[80126]: 1: [v2:192.168.122.102:3300/0,v1:192.168.122.102:6789/0] mon.compute-2
Jan 23 10:20:08 compute-1 ceph-mon[80126]: 2: [v2:192.168.122.101:3300/0,v1:192.168.122.101:6789/0] mon.compute-1
Jan 23 10:20:08 compute-1 ceph-mon[80126]: fsmap cephfs:1 {0=cephfs.compute-2.prgzmm=up:active} 2 up:standby
Jan 23 10:20:08 compute-1 ceph-mon[80126]: osdmap e146: 3 total, 3 up, 3 in
Jan 23 10:20:08 compute-1 ceph-mon[80126]: mgrmap e32: compute-0.nbdygh(active, since 25m), standbys: compute-2.uczrot, compute-1.jmakme
Jan 23 10:20:08 compute-1 ceph-mon[80126]: Health detail: HEALTH_WARN 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:20:08 compute-1 ceph-mon[80126]: [WRN] BLUESTORE_SLOW_OP_ALERT: 1 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:20:08 compute-1 ceph-mon[80126]:      osd.1 observed slow operation indications in BlueStore
Jan 23 10:20:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1281217101' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2381106667' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:08 compute-1 ceph-mon[80126]: pgmap v805: 353 pgs: 353 active+clean; 92 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 712 KiB/s rd, 588 KiB/s wr, 37 op/s
Jan 23 10:20:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:09 compute-1 nova_compute[225705]: 2026-01-23 10:20:09.499 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:09 compute-1 ovn_controller[133293]: 2026-01-23T10:20:09Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d5:a8:9e 10.100.0.8
Jan 23 10:20:09 compute-1 ovn_controller[133293]: 2026-01-23T10:20:09Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d5:a8:9e 10.100.0.8
Jan 23 10:20:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:20:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:09.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:20:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1702015841' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1764227055' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:09 compute-1 ceph-mon[80126]: pgmap v806: 353 pgs: 353 active+clean; 92 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 588 KiB/s wr, 15 op/s
Jan 23 10:20:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:10.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:11 compute-1 ceph-mon[80126]: pgmap v807: 353 pgs: 353 active+clean; 92 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 588 KiB/s wr, 15 op/s
Jan 23 10:20:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00047b0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:11 compute-1 nova_compute[225705]: 2026-01-23 10:20:11.427 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:11.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:12.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:12 compute-1 nova_compute[225705]: 2026-01-23 10:20:12.824 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:13 compute-1 ceph-mon[80126]: Health check update: 2 OSD(s) experiencing slow operations in BlueStore (BLUESTORE_SLOW_OP_ALERT)
Jan 23 10:20:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000033s ======
Jan 23 10:20:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:13.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000033s
Jan 23 10:20:14 compute-1 ceph-mon[80126]: pgmap v808: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 327 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:20:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00047d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:14.551 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040025a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:15 compute-1 ceph-mon[80126]: pgmap v809: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 299 KiB/s rd, 2.1 MiB/s wr, 55 op/s
Jan 23 10:20:15 compute-1 nova_compute[225705]: 2026-01-23 10:20:15.866 225709 INFO nova.compute.manager [None req-f566c29b-e1ca-4f38-a548-e7924f179629 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Get console output
Jan 23 10:20:15 compute-1 nova_compute[225705]: 2026-01-23 10:20:15.878 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 10:20:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:15.914 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:16 compute-1 nova_compute[225705]: 2026-01-23 10:20:16.430 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:16.555 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00047f0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:16 compute-1 podman[231322]: 2026-01-23 10:20:16.657982764 +0000 UTC m=+0.058690186 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 10:20:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:17 compute-1 ceph-mon[80126]: pgmap v810: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 300 KiB/s rd, 2.1 MiB/s wr, 56 op/s
Jan 23 10:20:17 compute-1 nova_compute[225705]: 2026-01-23 10:20:17.828 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:17.918 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:18.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:18 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:20:18.564 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:20:18 compute-1 nova_compute[225705]: 2026-01-23 10:20:18.565 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:18 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:20:18.565 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:20:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:18 compute-1 nova_compute[225705]: 2026-01-23 10:20:18.678 225709 DEBUG nova.compute.manager [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:20:18 compute-1 nova_compute[225705]: 2026-01-23 10:20:18.678 225709 DEBUG nova.compute.manager [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing instance network info cache due to event network-changed-5def28f3-3bf5-4f1f-8e37-51794dbddfc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:20:18 compute-1 nova_compute[225705]: 2026-01-23 10:20:18.679 225709 DEBUG oslo_concurrency.lockutils [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:20:18 compute-1 nova_compute[225705]: 2026-01-23 10:20:18.679 225709 DEBUG oslo_concurrency.lockutils [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:20:18 compute-1 nova_compute[225705]: 2026-01-23 10:20:18.680 225709 DEBUG nova.network.neutron [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Refreshing network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:20:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004810 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:19 compute-1 ceph-mon[80126]: pgmap v811: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 296 KiB/s rd, 1.6 MiB/s wr, 48 op/s
Jan 23 10:20:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:19.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:19 compute-1 nova_compute[225705]: 2026-01-23 10:20:19.974 225709 DEBUG nova.network.neutron [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated VIF entry in instance network info cache for port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:20:19 compute-1 nova_compute[225705]: 2026-01-23 10:20:19.974 225709 DEBUG nova.network.neutron [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:20:19 compute-1 nova_compute[225705]: 2026-01-23 10:20:19.996 225709 DEBUG oslo_concurrency.lockutils [req-ea6b864f-d39d-4607-a5c6-56dd86938c13 req-afad342b-e91a-4847-b1ac-1ef8f489dd74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:20:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:20.562 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:21 compute-1 nova_compute[225705]: 2026-01-23 10:20:21.463 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:21.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:22 compute-1 ceph-mon[80126]: pgmap v812: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 296 KiB/s rd, 1.6 MiB/s wr, 48 op/s
Jan 23 10:20:22 compute-1 sudo[231346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:20:22 compute-1 sudo[231346]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:22 compute-1 sudo[231346]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:22 compute-1 sudo[231371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:20:22 compute-1 sudo[231371]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 10:20:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:22.566 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 10:20:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:22 compute-1 sudo[231371]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:22 compute-1 nova_compute[225705]: 2026-01-23 10:20:22.831 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:23 compute-1 sudo[231417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:20:23 compute-1 sudo[231417]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:23 compute-1 sudo[231417]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:23 compute-1 sudo[231442]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:20:23 compute-1 sudo[231442]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:23 compute-1 sudo[231442]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:23 compute-1 sudo[231500]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:20:23 compute-1 sudo[231500]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:23 compute-1 sudo[231500]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:23 compute-1 sudo[231525]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ceph-volume --fsid f3005f84-239a-55b6-a948-8f1fb592b920 -- inventory --format=json-pretty --filter-for-batch
Jan 23 10:20:23 compute-1 sudo[231525]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:23.924 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:23 compute-1 ceph-mon[80126]: pgmap v813: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 296 KiB/s rd, 1.6 MiB/s wr, 48 op/s
Jan 23 10:20:23 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:23 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:24 compute-1 sudo[231564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:20:24 compute-1 sudo[231564]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:24 compute-1 sudo[231564]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:24 compute-1 podman[231615]: 2026-01-23 10:20:24.292301086 +0000 UTC m=+0.038717115 container create 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.build-date=20250325, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>)
Jan 23 10:20:24 compute-1 systemd[1]: Started libpod-conmon-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope.
Jan 23 10:20:24 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:20:24 compute-1 podman[231615]: 2026-01-23 10:20:24.274907066 +0000 UTC m=+0.021323105 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:20:24 compute-1 podman[231615]: 2026-01-23 10:20:24.385340007 +0000 UTC m=+0.131756046 container init 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.40.1, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default)
Jan 23 10:20:24 compute-1 podman[231615]: 2026-01-23 10:20:24.397811801 +0000 UTC m=+0.144227810 container start 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.license=GPLv2, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325)
Jan 23 10:20:24 compute-1 podman[231615]: 2026-01-23 10:20:24.401523619 +0000 UTC m=+0.147939638 container attach 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:20:24 compute-1 zen_blackburn[231631]: 167 167
Jan 23 10:20:24 compute-1 systemd[1]: libpod-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope: Deactivated successfully.
Jan 23 10:20:24 compute-1 conmon[231631]: conmon 87cd8b0a9f0eb6a87a8e <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope/container/memory.events
Jan 23 10:20:24 compute-1 podman[231615]: 2026-01-23 10:20:24.413534349 +0000 UTC m=+0.159950398 container died 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:20:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-91478f3a342789530622a48e8c1e4e0cdb64828e1319f68d0c8403600efe1b51-merged.mount: Deactivated successfully.
Jan 23 10:20:24 compute-1 podman[231615]: 2026-01-23 10:20:24.45631955 +0000 UTC m=+0.202735569 container remove 87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=zen_blackburn, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True)
Jan 23 10:20:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:24 compute-1 systemd[1]: libpod-conmon-87cd8b0a9f0eb6a87a8e6a6920cb8ac36ad327d23f25db937bda572afcf85d30.scope: Deactivated successfully.
Jan 23 10:20:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:24.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:24 compute-1 podman[231654]: 2026-01-23 10:20:24.624997372 +0000 UTC m=+0.041304936 container create 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 10:20:24 compute-1 systemd[1]: Started libpod-conmon-1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613.scope.
Jan 23 10:20:24 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:20:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Jan 23 10:20:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:20:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Jan 23 10:20:24 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Jan 23 10:20:24 compute-1 podman[231654]: 2026-01-23 10:20:24.606988094 +0000 UTC m=+0.023295678 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:20:24 compute-1 podman[231654]: 2026-01-23 10:20:24.711238948 +0000 UTC m=+0.127546522 container init 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:20:24 compute-1 podman[231654]: 2026-01-23 10:20:24.717421564 +0000 UTC m=+0.133729148 container start 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, ceph=True, org.label-schema.build-date=20250325, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62)
Jan 23 10:20:24 compute-1 podman[231654]: 2026-01-23 10:20:24.721873085 +0000 UTC m=+0.138180659 container attach 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:20:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:25 compute-1 focused_villani[231670]: [
Jan 23 10:20:25 compute-1 focused_villani[231670]:     {
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "available": false,
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "being_replaced": false,
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "ceph_device_lvm": false,
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "device_id": "QEMU_DVD-ROM_QM00001",
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "lsm_data": {},
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "lvs": [],
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "path": "/dev/sr0",
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "rejected_reasons": [
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "Insufficient space (<5GB)",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "Has a FileSystem"
Jan 23 10:20:25 compute-1 focused_villani[231670]:         ],
Jan 23 10:20:25 compute-1 focused_villani[231670]:         "sys_api": {
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "actuators": null,
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "device_nodes": [
Jan 23 10:20:25 compute-1 focused_villani[231670]:                 "sr0"
Jan 23 10:20:25 compute-1 focused_villani[231670]:             ],
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "devname": "sr0",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "human_readable_size": "482.00 KB",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "id_bus": "ata",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "model": "QEMU DVD-ROM",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "nr_requests": "2",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "parent": "/dev/sr0",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "partitions": {},
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "path": "/dev/sr0",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "removable": "1",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "rev": "2.5+",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "ro": "0",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "rotational": "1",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "sas_address": "",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "sas_device_handle": "",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "scheduler_mode": "mq-deadline",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "sectors": 0,
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "sectorsize": "2048",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "size": 493568.0,
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "support_discard": "2048",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "type": "disk",
Jan 23 10:20:25 compute-1 focused_villani[231670]:             "vendor": "QEMU"
Jan 23 10:20:25 compute-1 focused_villani[231670]:         }
Jan 23 10:20:25 compute-1 focused_villani[231670]:     }
Jan 23 10:20:25 compute-1 focused_villani[231670]: ]
Jan 23 10:20:25 compute-1 systemd[1]: libpod-1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613.scope: Deactivated successfully.
Jan 23 10:20:25 compute-1 podman[231654]: 2026-01-23 10:20:25.571530303 +0000 UTC m=+0.987837897 container died 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0)
Jan 23 10:20:25 compute-1 systemd[1]: var-lib-containers-storage-overlay-886a09d2ea6fbe4bdddeef7e119c6cf3d9b4d4261b2318ff6f70d0442df9411b-merged.mount: Deactivated successfully.
Jan 23 10:20:25 compute-1 podman[231654]: 2026-01-23 10:20:25.62267265 +0000 UTC m=+1.038980254 container remove 1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=focused_villani, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:20:25 compute-1 systemd[1]: libpod-conmon-1bae1041fec75ec65fcf31344519bb9a94f3b70fde05c932af3a74bca9371613.scope: Deactivated successfully.
Jan 23 10:20:25 compute-1 sudo[231525]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:25 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:25 compute-1 ceph-mon[80126]: pgmap v814: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 15 KiB/s wr, 0 op/s
Jan 23 10:20:25 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4205134929' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:25.928 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:26 compute-1 nova_compute[225705]: 2026-01-23 10:20:26.466 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:26.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:27 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:20:27.568 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:20:27 compute-1 nova_compute[225705]: 2026-01-23 10:20:27.835 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:27 compute-1 ceph-mon[80126]: pgmap v815: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 16 KiB/s wr, 1 op/s
Jan 23 10:20:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:27.931 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:28.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:29 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:29 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:29 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:20:29 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:20:29 compute-1 ceph-mon[80126]: pgmap v816: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:20:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:29.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:20:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:20:30 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:20:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:30.580 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:31 compute-1 ceph-mon[80126]: pgmap v817: 353 pgs: 353 active+clean; 121 MiB data, 298 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 3.3 KiB/s wr, 0 op/s
Jan 23 10:20:31 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/733879018' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:20:31 compute-1 nova_compute[225705]: 2026-01-23 10:20:31.467 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:31.937 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1524953951' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:20:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:32.583 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:32 compute-1 nova_compute[225705]: 2026-01-23 10:20:32.837 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:33 compute-1 ceph-mon[80126]: pgmap v818: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:20:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:33.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:34 compute-1 sudo[233044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:20:34 compute-1 sudo[233044]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:34 compute-1 sudo[233044]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:34 compute-1 podman[233068]: 2026-01-23 10:20:34.251350076 +0000 UTC m=+0.081979213 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 10:20:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_22] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:34 compute-1 sshd-session[233039]: Connection reset by 205.210.31.225 port 58482 [preauth]
Jan 23 10:20:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:34.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:20:35 compute-1 ceph-mon[80126]: pgmap v819: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:20:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_24] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004830 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:35.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800bcf0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:36 compute-1 nova_compute[225705]: 2026-01-23 10:20:36.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:20:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:36.586 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:20:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:37 compute-1 ceph-mon[80126]: pgmap v820: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:37 compute-1 nova_compute[225705]: 2026-01-23 10:20:37.840 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:37.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00048c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:38.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:39 compute-1 ceph-mon[80126]: pgmap v821: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:39.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00048e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:40.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:41 compute-1 nova_compute[225705]: 2026-01-23 10:20:41.525 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:41 compute-1 ceph-mon[80126]: pgmap v822: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:41.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:42.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:42 compute-1 nova_compute[225705]: 2026-01-23 10:20:42.844 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:43 compute-1 ceph-mon[80126]: pgmap v823: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:20:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004900 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:43.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:44 compute-1 sudo[233104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:20:44 compute-1 sudo[233104]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:20:44 compute-1 sudo[233104]: pam_unix(sudo:session): session closed for user root
Jan 23 10:20:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14001ff0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:44.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:45 compute-1 ceph-mon[80126]: pgmap v824: 353 pgs: 353 active+clean; 167 MiB data, 319 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 15 KiB/s wr, 74 op/s
Jan 23 10:20:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:45.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:46 compute-1 nova_compute[225705]: 2026-01-23 10:20:46.527 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004920 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:46.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:20:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Cumulative writes: 4988 writes, 27K keys, 4988 commit groups, 1.0 writes per commit group, ingest: 0.07 GB, 0.04 MB/s
                                           Cumulative WAL: 4988 writes, 4988 syncs, 1.00 writes per sync, written: 0.07 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1476 writes, 7220 keys, 1476 commit groups, 1.0 writes per commit group, ingest: 17.08 MB, 0.03 MB/s
                                           Interval WAL: 1476 writes, 1476 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     53.2      0.71              0.12        14    0.050       0      0       0.0       0.0
                                             L6      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   4.3    120.5    104.0      1.54              0.46        13    0.119     68K   6777       0.0       0.0
                                            Sum      1/0   12.23 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.3     82.7     88.1      2.25              0.57        27    0.083     68K   6777       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   7.3     88.6     87.9      0.83              0.20        10    0.083     29K   2602       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   0.0    120.5    104.0      1.54              0.46        13    0.119     68K   6777       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     53.4      0.70              0.12        13    0.054       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1800.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.037, interval 0.010
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.19 GB write, 0.11 MB/s write, 0.18 GB read, 0.10 MB/s read, 2.3 seconds
                                           Interval compaction: 0.07 GB write, 0.12 MB/s write, 0.07 GB read, 0.12 MB/s read, 0.8 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 13.52 MB table_size: 0 occupancy: 18446744073709551615 collections: 4 last_copies: 0 last_secs: 0.000233 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(719,12.97 MB,4.26751%) FilterBlock(27,201.17 KB,0.064624%) IndexBlock(27,355.48 KB,0.114195%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:20:47 compute-1 podman[233131]: 2026-01-23 10:20:47.68502592 +0000 UTC m=+0.081290801 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:20:47 compute-1 ceph-mon[80126]: pgmap v825: 353 pgs: 353 active+clean; 192 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 120 op/s
Jan 23 10:20:47 compute-1 nova_compute[225705]: 2026-01-23 10:20:47.847 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:47.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004940 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:48.604 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2314219766' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:20:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2314219766' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:20:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:49.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:50 compute-1 ceph-mon[80126]: pgmap v826: 353 pgs: 353 active+clean; 192 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 261 KiB/s rd, 2.0 MiB/s wr, 46 op/s
Jan 23 10:20:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:20:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:50.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:51 compute-1 ceph-mon[80126]: pgmap v827: 353 pgs: 353 active+clean; 192 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 261 KiB/s rd, 2.0 MiB/s wr, 46 op/s
Jan 23 10:20:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004960 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:51 compute-1 nova_compute[225705]: 2026-01-23 10:20:51.530 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:51.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:52.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:52 compute-1 nova_compute[225705]: 2026-01-23 10:20:52.849 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:53 compute-1 ceph-mon[80126]: pgmap v828: 353 pgs: 353 active+clean; 200 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 63 op/s
Jan 23 10:20:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:53.982 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004980 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004980 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:54.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:20:55.048 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:20:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:20:55.049 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:20:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:20:55.050 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:20:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:55 compute-1 ceph-mon[80126]: pgmap v829: 353 pgs: 353 active+clean; 200 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 325 KiB/s rd, 2.1 MiB/s wr, 62 op/s
Jan 23 10:20:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:55.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:56 compute-1 nova_compute[225705]: 2026-01-23 10:20:56.531 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:20:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:56.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:20:56 compute-1 nova_compute[225705]: 2026-01-23 10:20:56.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:56 compute-1 nova_compute[225705]: 2026-01-23 10:20:56.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:20:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:57 compute-1 ceph-mon[80126]: pgmap v830: 353 pgs: 353 active+clean; 121 MiB data, 341 MiB used, 60 GiB / 60 GiB avail; 341 KiB/s rd, 2.1 MiB/s wr, 85 op/s
Jan 23 10:20:57 compute-1 nova_compute[225705]: 2026-01-23 10:20:57.852 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:20:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000064s ======
Jan 23 10:20:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:57.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000064s
Jan 23 10:20:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:20:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00049a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:20:58.622 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:20:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1700923100' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:20:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:20:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:20:59 compute-1 ceph-mon[80126]: pgmap v831: 353 pgs: 353 active+clean; 121 MiB data, 341 MiB used, 60 GiB / 60 GiB avail; 79 KiB/s rd, 105 KiB/s wr, 39 op/s
Jan 23 10:20:59 compute-1 nova_compute[225705]: 2026-01-23 10:20:59.888 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:20:59 compute-1 nova_compute[225705]: 2026-01-23 10:20:59.888 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:20:59 compute-1 nova_compute[225705]: 2026-01-23 10:20:59.888 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:20:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:20:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:20:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:20:59.991 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.062 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.063 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.063 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.063 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.261 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.262 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.264 225709 INFO nova.compute.manager [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Terminating instance
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.265 225709 DEBUG nova.compute.manager [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:21:00 compute-1 kernel: tap5def28f3-3b (unregistering): left promiscuous mode
Jan 23 10:21:00 compute-1 NetworkManager[48978]: <info>  [1769163660.3267] device (tap5def28f3-3b): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:21:00 compute-1 ovn_controller[133293]: 2026-01-23T10:21:00Z|00053|binding|INFO|Releasing lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 from this chassis (sb_readonly=0)
Jan 23 10:21:00 compute-1 ovn_controller[133293]: 2026-01-23T10:21:00Z|00054|binding|INFO|Setting lport 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 down in Southbound
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.343 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 ovn_controller[133293]: 2026-01-23T10:21:00Z|00055|binding|INFO|Removing iface tap5def28f3-3b ovn-installed in OVS
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.345 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.355 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d5:a8:9e 10.100.0.8'], port_security=['fa:16:3e:d5:a8:9e 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-13f52d23-9898-43a0-a951-b69cb2abebab', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '077cd29b-8d1e-4ab1-b762-8cd58191c522', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=30316727-f942-4d99-94ec-26d1184b5c8a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=5def28f3-3bf5-4f1f-8e37-51794dbddfc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.357 143098 INFO neutron.agent.ovn.metadata.agent [-] Port 5def28f3-3bf5-4f1f-8e37-51794dbddfc6 in datapath 13f52d23-9898-43a0-a951-b69cb2abebab unbound from our chassis
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.359 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 13f52d23-9898-43a0-a951-b69cb2abebab, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.360 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[27fe0fd6-a959-4371-b64e-50c3e31fd1f8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.361 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab namespace which is not needed anymore
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.370 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Deactivated successfully.
Jan 23 10:21:00 compute-1 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000004.scope: Consumed 15.270s CPU time.
Jan 23 10:21:00 compute-1 systemd-machined[194551]: Machine qemu-2-instance-00000004 terminated.
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.500 225709 INFO nova.virt.libvirt.driver [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Instance destroyed successfully.
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.500 225709 DEBUG nova.objects.instance [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:21:00 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : haproxy version is 2.8.14-c23fe91
Jan 23 10:21:00 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [NOTICE]   (231195) : path to executable is /usr/sbin/haproxy
Jan 23 10:21:00 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [WARNING]  (231195) : Exiting Master process...
Jan 23 10:21:00 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [WARNING]  (231195) : Exiting Master process...
Jan 23 10:21:00 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [ALERT]    (231195) : Current worker (231198) exited with code 143 (Terminated)
Jan 23 10:21:00 compute-1 neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab[231191]: [WARNING]  (231195) : All workers exited. Exiting... (0)
Jan 23 10:21:00 compute-1 systemd[1]: libpod-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e.scope: Deactivated successfully.
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.520 225709 DEBUG nova.virt.libvirt.vif [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:19:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-263648847',display_name='tempest-TestNetworkBasicOps-server-263648847',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-263648847',id=4,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD66+yyfhusLfImf1uKQDIRs2V4/o1F8Eh/Z0SqxwU6ND9wcYC22x/WBjiGE3hyxp/MbA2OwCQVvqoeXerAOWd4tdGub7VFIxVXpqt6OghLL3nU7rH27QJ0mug8wzMNOTA==',key_name='tempest-TestNetworkBasicOps-501443179',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:19:50Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-rvp4p09d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:19:50Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.520 225709 DEBUG nova.network.os_vif_util [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:21:00 compute-1 podman[233182]: 2026-01-23 10:21:00.521299888 +0000 UTC m=+0.052057956 container died 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.521 225709 DEBUG nova.network.os_vif_util [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.522 225709 DEBUG os_vif [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.524 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.524 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5def28f3-3b, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.527 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.531 225709 INFO os_vif [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d5:a8:9e,bridge_name='br-int',has_traffic_filtering=True,id=5def28f3-3bf5-4f1f-8e37-51794dbddfc6,network=Network(13f52d23-9898-43a0-a951-b69cb2abebab),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap5def28f3-3b')
Jan 23 10:21:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:00 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e-userdata-shm.mount: Deactivated successfully.
Jan 23 10:21:00 compute-1 systemd[1]: var-lib-containers-storage-overlay-200b86ca9cf8069eb92d9df0e5692d0577d96f4ecb60bc72d1a62929a70d50cc-merged.mount: Deactivated successfully.
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.559 225709 DEBUG nova.compute.manager [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-unplugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.560 225709 DEBUG oslo_concurrency.lockutils [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.560 225709 DEBUG oslo_concurrency.lockutils [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.560 225709 DEBUG oslo_concurrency.lockutils [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.561 225709 DEBUG nova.compute.manager [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] No waiting events found dispatching network-vif-unplugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.561 225709 DEBUG nova.compute.manager [req-42e67bfc-86de-4368-bcc5-35f0b536b75a req-2de3d64a-6abe-4592-a168-a593e261face 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-unplugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:21:00 compute-1 podman[233182]: 2026-01-23 10:21:00.565329159 +0000 UTC m=+0.096087227 container cleanup 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 10:21:00 compute-1 systemd[1]: libpod-conmon-4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e.scope: Deactivated successfully.
Jan 23 10:21:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:00.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:00 compute-1 podman[233240]: 2026-01-23 10:21:00.628726264 +0000 UTC m=+0.042783073 container remove 4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.634 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[14dbe623-67bb-422e-a20d-79807d6a3f87]: (4, ('Fri Jan 23 10:21:00 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab (4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e)\n4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e\nFri Jan 23 10:21:00 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab (4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e)\n4cc0adf8d1a628a64257300b488b99ba640101d8d2a4b1d287bffcf86ec2453e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.636 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[fbbef6ed-f6c7-4c36-8a2b-e829bc3c7b15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.637 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap13f52d23-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.638 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 kernel: tap13f52d23-90: left promiscuous mode
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.650 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.653 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3a334f-e4f8-4abb-8404-41ffc775ead1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.661 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:21:00 compute-1 nova_compute[225705]: 2026-01-23 10:21:00.662 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.666 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[62384201-185a-43ef-8cd0-6ebaf080cd07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.666 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[60a288ee-9752-48a0-a52a-bce75fc509d4]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.686 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[798a3144-5d55-4f2b-81f8-dbcef46b1798]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 474373, 'reachable_time': 34542, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233257, 'error': None, 'target': 'ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.688 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-13f52d23-9898-43a0-a951-b69cb2abebab deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:21:00 compute-1 systemd[1]: run-netns-ovnmeta\x2d13f52d23\x2d9898\x2d43a0\x2da951\x2db69cb2abebab.mount: Deactivated successfully.
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.688 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[48b40954-aaff-4177-807f-e325e7879b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:00.690 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.130 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [{"id": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "address": "fa:16:3e:d5:a8:9e", "network": {"id": "13f52d23-9898-43a0-a951-b69cb2abebab", "bridge": "br-int", "label": "tempest-network-smoke--2142663020", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap5def28f3-3b", "ovs_interfaceid": "5def28f3-3bf5-4f1f-8e37-51794dbddfc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.149 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.168 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.168 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00049c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102101 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:21:01 compute-1 nova_compute[225705]: 2026-01-23 10:21:01.533 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:01 compute-1 ceph-mon[80126]: pgmap v832: 353 pgs: 353 active+clean; 121 MiB data, 341 MiB used, 60 GiB / 60 GiB avail; 79 KiB/s rd, 105 KiB/s wr, 39 op/s
Jan 23 10:21:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:01.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.159 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.193 225709 INFO nova.virt.libvirt.driver [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deleting instance files /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_del
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.195 225709 INFO nova.virt.libvirt.driver [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deletion of /var/lib/nova/instances/87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334_del complete
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.253 225709 INFO nova.compute.manager [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 1.99 seconds to destroy the instance on the hypervisor.
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.254 225709 DEBUG oslo.service.loopingcall [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.254 225709 DEBUG nova.compute.manager [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.255 225709 DEBUG nova.network.neutron [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:21:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:02.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.641 225709 DEBUG nova.compute.manager [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.642 225709 DEBUG oslo_concurrency.lockutils [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.642 225709 DEBUG oslo_concurrency.lockutils [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.643 225709 DEBUG oslo_concurrency.lockutils [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.643 225709 DEBUG nova.compute.manager [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] No waiting events found dispatching network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.644 225709 WARNING nova.compute.manager [req-87e6de03-1aae-4e3c-9d60-6f20b85fa92d req-4c51d144-6727-42a4-9cfc-4ec82b9664c1 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received unexpected event network-vif-plugged-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 for instance with vm_state active and task_state deleting.
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.679 225709 DEBUG nova.network.neutron [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.694 225709 INFO nova.compute.manager [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Took 0.44 seconds to deallocate network for instance.
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.735 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.736 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.740 225709 DEBUG nova.compute.manager [req-a3079a84-5d7e-43be-b418-60a417efb7f0 req-07dec829-c93f-4e3f-8157-f2ee5df7755e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Received event network-vif-deleted-5def28f3-3bf5-4f1f-8e37-51794dbddfc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:21:02 compute-1 nova_compute[225705]: 2026-01-23 10:21:02.785 225709 DEBUG oslo_concurrency.processutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:21:03 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/215747545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004140 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:03 compute-1 nova_compute[225705]: 2026-01-23 10:21:03.261 225709 DEBUG oslo_concurrency.processutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.476s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:03 compute-1 nova_compute[225705]: 2026-01-23 10:21:03.268 225709 DEBUG nova.compute.provider_tree [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:21:03 compute-1 nova_compute[225705]: 2026-01-23 10:21:03.393 225709 DEBUG nova.scheduler.client.report [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:21:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:03 compute-1 nova_compute[225705]: 2026-01-23 10:21:03.757 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.021s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:03 compute-1 ceph-mon[80126]: pgmap v833: 353 pgs: 353 active+clean; 121 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 86 KiB/s rd, 105 KiB/s wr, 48 op/s
Jan 23 10:21:03 compute-1 nova_compute[225705]: 2026-01-23 10:21:03.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:03 compute-1 nova_compute[225705]: 2026-01-23 10:21:03.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:03.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:04 compute-1 nova_compute[225705]: 2026-01-23 10:21:04.275 225709 INFO nova.scheduler.client.report [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334
Jan 23 10:21:04 compute-1 sudo[233283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:21:04 compute-1 sudo[233283]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:04 compute-1 sudo[233283]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef00049e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:04 compute-1 podman[233307]: 2026-01-23 10:21:04.567305363 +0000 UTC m=+0.139027226 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Jan 23 10:21:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:04.631 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:04 compute-1 nova_compute[225705]: 2026-01-23 10:21:04.750 225709 DEBUG oslo_concurrency.lockutils [None req-0e440208-c784-4be1-ba29-9d79404ae9d5 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.488s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3874868156' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/215747545' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faeec001230 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:05 compute-1 nova_compute[225705]: 2026-01-23 10:21:05.557 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:05 compute-1 nova_compute[225705]: 2026-01-23 10:21:05.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:06.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:06 compute-1 ceph-mon[80126]: pgmap v834: 353 pgs: 353 active+clean; 121 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 12 KiB/s wr, 31 op/s
Jan 23 10:21:06 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2548080809' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.535 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_27] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004160 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:06.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.868 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.882 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.883 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.883 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.901 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:21:06 compute-1 nova_compute[225705]: 2026-01-23 10:21:06.902 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040028e0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:21:07 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3840207814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.374 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.585 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.587 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4880MB free_disk=59.94269943237305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.587 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.587 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.682 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.682 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:21:07 compute-1 nova_compute[225705]: 2026-01-23 10:21:07.740 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1928470914' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:07 compute-1 ceph-mon[80126]: pgmap v835: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 39 KiB/s rd, 14 KiB/s wr, 56 op/s
Jan 23 10:21:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/731693582' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:08.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:21:08 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3507747902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:08 compute-1 nova_compute[225705]: 2026-01-23 10:21:08.195 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:08 compute-1 nova_compute[225705]: 2026-01-23 10:21:08.203 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:21:08 compute-1 nova_compute[225705]: 2026-01-23 10:21:08.219 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:21:08 compute-1 nova_compute[225705]: 2026-01-23 10:21:08.242 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:21:08 compute-1 nova_compute[225705]: 2026-01-23 10:21:08.242 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.655s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:08.637 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:09 compute-1 nova_compute[225705]: 2026-01-23 10:21:09.235 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:09 compute-1 nova_compute[225705]: 2026-01-23 10:21:09.236 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:21:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3840207814' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3507747902' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:10.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:10 compute-1 nova_compute[225705]: 2026-01-23 10:21:10.560 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:21:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:10.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:21:10 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:10.691 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:11 compute-1 ceph-mon[80126]: pgmap v836: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.7 KiB/s wr, 34 op/s
Jan 23 10:21:11 compute-1 nova_compute[225705]: 2026-01-23 10:21:11.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:11 compute-1 nova_compute[225705]: 2026-01-23 10:21:11.587 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:11 compute-1 nova_compute[225705]: 2026-01-23 10:21:11.677 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:12.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:21:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:21:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:12.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:12 compute-1 ceph-mon[80126]: pgmap v837: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 1.7 KiB/s wr, 34 op/s
Jan 23 10:21:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:13 compute-1 ceph-mon[80126]: pgmap v838: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 24 KiB/s rd, 2.3 KiB/s wr, 35 op/s
Jan 23 10:21:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:14.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:14.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:15 compute-1 nova_compute[225705]: 2026-01-23 10:21:15.498 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163660.4970114, 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:21:15 compute-1 nova_compute[225705]: 2026-01-23 10:21:15.499 225709 INFO nova.compute.manager [-] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] VM Stopped (Lifecycle Event)
Jan 23 10:21:15 compute-1 nova_compute[225705]: 2026-01-23 10:21:15.521 225709 DEBUG nova.compute.manager [None req-3d57e49d-c759-4808-b57a-b5615e9d1214 - - - - - -] [instance: 87fb4628-ac8f-45e8-a3b9-ddfc1bcd4334] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:21:15 compute-1 nova_compute[225705]: 2026-01-23 10:21:15.564 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:21:15 compute-1 ceph-mon[80126]: pgmap v839: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.7 KiB/s wr, 26 op/s
Jan 23 10:21:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:16.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:16 compute-1 nova_compute[225705]: 2026-01-23 10:21:16.542 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:16.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:18.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:18 compute-1 ceph-mon[80126]: pgmap v840: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.2 KiB/s wr, 28 op/s
Jan 23 10:21:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:18.653 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:18 compute-1 podman[233389]: 2026-01-23 10:21:18.669434094 +0000 UTC m=+0.074705493 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:21:19 compute-1 ceph-mon[80126]: pgmap v841: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:21:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0004180 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:20.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:20 compute-1 nova_compute[225705]: 2026-01-23 10:21:20.609 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:20.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:20 compute-1 nova_compute[225705]: 2026-01-23 10:21:20.775 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:21:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102121 (4) : Server backend/nfs.cephfs.2 is UP, reason: Layer4 check passed, check duration: 1ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:21:21 compute-1 nova_compute[225705]: 2026-01-23 10:21:21.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:21 compute-1 ceph-mon[80126]: pgmap v842: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:21:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:22.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:22.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:23 compute-1 ceph-mon[80126]: pgmap v843: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 2.2 KiB/s rd, 1023 B/s wr, 3 op/s
Jan 23 10:21:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:24.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:24 compute-1 sudo[233410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:21:24 compute-1 sudo[233410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:24 compute-1 sudo[233410]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0004a00 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_26] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee00041a0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:24.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_19] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:25 compute-1 nova_compute[225705]: 2026-01-23 10:21:25.614 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:25 compute-1 ceph-mon[80126]: pgmap v844: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:21:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:26.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:26 compute-1 nova_compute[225705]: 2026-01-23 10:21:26.547 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:26.667 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:27 compute-1 ceph-mon[80126]: pgmap v845: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 1.3 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:21:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_13] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:28.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:28.671 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001670 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:29 compute-1 ceph-mon[80126]: pgmap v846: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:21:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:30.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040011d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:30 compute-1 nova_compute[225705]: 2026-01-23 10:21:30.616 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:30.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:31 compute-1 nova_compute[225705]: 2026-01-23 10:21:31.549 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:31 compute-1 ceph-mon[80126]: pgmap v847: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:21:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:32.039 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.140 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.141 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.164 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.235 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.236 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.248 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.248 225709 INFO nova.compute.claims [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Claim successful on node compute-1.ctlplane.example.com
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.372 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001670 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040011d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:32.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:21:32 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3733854137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.877 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.504s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:32 compute-1 nova_compute[225705]: 2026-01-23 10:21:32.884 225709 DEBUG nova.compute.provider_tree [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.099 225709 DEBUG nova.scheduler.client.report [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.127 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.891s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.128 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.175 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.176 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.197 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.215 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:21:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.297 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.299 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.299 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Creating image(s)
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.332 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.366 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.396 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.402 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.490 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.492 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.493 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.494 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.525 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.532 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:33 compute-1 nova_compute[225705]: 2026-01-23 10:21:33.561 225709 DEBUG nova.policy [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:21:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:33 compute-1 ceph-mon[80126]: pgmap v848: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:21:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:34.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:34 compute-1 sudo[233555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:21:34 compute-1 sudo[233555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:34 compute-1 sudo[233555]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:34 compute-1 sudo[233580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:21:34 compute-1 sudo[233580]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001670 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:34.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:34 compute-1 sudo[233580]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:35 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3733854137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:21:35 compute-1 nova_compute[225705]: 2026-01-23 10:21:35.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:35 compute-1 podman[233640]: 2026-01-23 10:21:35.715745761 +0000 UTC m=+0.112697263 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:21:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:36.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:36 compute-1 nova_compute[225705]: 2026-01-23 10:21:36.551 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:36 compute-1 nova_compute[225705]: 2026-01-23 10:21:36.620 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully created port: dfaa68a5-31a2-4de5-996e-11936357ca9b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:21:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:36.682 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8001810 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:37 compute-1 ceph-mon[80126]: pgmap v849: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:21:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:38.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.222 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 4.691s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.293 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.498 225709 DEBUG nova.objects.instance [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.512 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.513 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Ensure instance console log exists: /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.513 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.513 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.514 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:38.686 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.727 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully updated port: dfaa68a5-31a2-4de5-996e-11936357ca9b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.756 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.757 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.757 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:21:38 compute-1 ceph-mon[80126]: pgmap v850: 353 pgs: 353 active+clean; 62 MiB data, 257 MiB used, 60 GiB / 60 GiB avail; 8.4 KiB/s rd, 873 KiB/s wr, 14 op/s
Jan 23 10:21:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:38 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.889 225709 DEBUG nova.compute.manager [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.890 225709 DEBUG nova.compute.manager [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:21:38 compute-1 nova_compute[225705]: 2026-01-23 10:21:38.891 225709 DEBUG oslo_concurrency.lockutils [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:21:39 compute-1 nova_compute[225705]: 2026-01-23 10:21:39.124 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:21:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:40.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:40 compute-1 ceph-mon[80126]: pgmap v851: 353 pgs: 353 active+clean; 84 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 1.5 MiB/s wr, 16 op/s
Jan 23 10:21:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:21:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:21:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:21:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:21:40 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.367 225709 DEBUG nova.network.neutron [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.399 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.400 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance network_info: |[{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.400 225709 DEBUG oslo_concurrency.lockutils [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.401 225709 DEBUG nova.network.neutron [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.405 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start _get_guest_xml network_info=[{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.411 225709 WARNING nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.421 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.422 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.425 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.426 225709 DEBUG nova.virt.libvirt.host [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.426 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.426 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.427 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.427 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.427 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.428 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.429 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.429 225709 DEBUG nova.virt.hardware [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.431 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.622 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:40.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:21:40 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1204880216' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.895 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.930 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:21:40 compute-1 nova_compute[225705]: 2026-01-23 10:21:40.936 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4002eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:41 compute-1 ceph-mon[80126]: pgmap v852: 353 pgs: 353 active+clean; 84 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 8.5 KiB/s rd, 1.5 MiB/s wr, 16 op/s
Jan 23 10:21:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1204880216' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:21:41 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:21:41 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1226134693' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.421 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.423 225709 DEBUG nova.virt.libvirt.vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:21:33Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.423 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.424 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.425 225709 DEBUG nova.objects.instance [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.441 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <name>instance-00000006</name>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <memory>131072</memory>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <vcpu>1</vcpu>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <nova:creationTime>2026-01-23 10:21:40</nova:creationTime>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <nova:flavor name="m1.nano">
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:memory>128</nova:memory>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:disk>1</nova:disk>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:swap>0</nova:swap>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       </nova:flavor>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <nova:owner>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       </nova:owner>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <nova:ports>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:21:41 compute-1 nova_compute[225705]:           <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         </nova:port>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       </nova:ports>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </nova:instance>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <sysinfo type="smbios">
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <system>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <entry name="serial">db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <entry name="uuid">db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </system>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <os>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <boot dev="hd"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <smbios mode="sysinfo"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   </os>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <features>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <vmcoreinfo/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   </features>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <clock offset="utc">
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <timer name="hpet" present="no"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <cpu mode="host-model" match="exact">
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <disk type="network" device="disk">
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk">
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       </source>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <target dev="vda" bus="virtio"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <disk type="network" device="cdrom">
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config">
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       </source>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:21:41 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <target dev="sda" bus="sata"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <interface type="ethernet">
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <mac address="fa:16:3e:b7:90:a0"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <mtu size="1442"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <target dev="tapdfaa68a5-31"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <serial type="pty">
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <log file="/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log" append="off"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <video>
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </video>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <input type="tablet" bus="usb"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <rng model="virtio">
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <controller type="usb" index="0"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     <memballoon model="virtio">
Jan 23 10:21:41 compute-1 nova_compute[225705]:       <stats period="10"/>
Jan 23 10:21:41 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:21:41 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:21:41 compute-1 nova_compute[225705]: </domain>
Jan 23 10:21:41 compute-1 nova_compute[225705]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.443 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Preparing to wait for external event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.443 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.443 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.444 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.444 225709 DEBUG nova.virt.libvirt.vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:21:33Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.444 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.445 225709 DEBUG nova.network.os_vif_util [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.445 225709 DEBUG os_vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.446 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.447 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.447 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.452 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.452 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdfaa68a5-31, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.453 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapdfaa68a5-31, col_values=(('external_ids', {'iface-id': 'dfaa68a5-31a2-4de5-996e-11936357ca9b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b7:90:a0', 'vm-uuid': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:41 compute-1 NetworkManager[48978]: <info>  [1769163701.4973] manager: (tapdfaa68a5-31): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.499 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.508 225709 INFO os_vif [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31')
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.552 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.579 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.580 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.580 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:b7:90:a0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.581 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Using config drive
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.613 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.815 225709 DEBUG nova.network.neutron [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.816 225709 DEBUG nova.network.neutron [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.842 225709 DEBUG oslo_concurrency.lockutils [req-1db95e66-0994-411f-898c-eed50d900dda req-ee752d5e-1696-410f-930c-180500f41017 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.966 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Creating config drive at /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config
Jan 23 10:21:41 compute-1 nova_compute[225705]: 2026-01-23 10:21:41.973 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcuy4rb25 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:42.052 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:42 compute-1 nova_compute[225705]: 2026-01-23 10:21:42.115 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcuy4rb25" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:42 compute-1 nova_compute[225705]: 2026-01-23 10:21:42.163 225709 DEBUG nova.storage.rbd_utils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:21:42 compute-1 nova_compute[225705]: 2026-01-23 10:21:42.167 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:21:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1226134693' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:21:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:42.695 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.292 225709 DEBUG oslo_concurrency.processutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.294 225709 INFO nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deleting local config drive /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/disk.config because it was imported into RBD.
Jan 23 10:21:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:43 compute-1 kernel: tapdfaa68a5-31: entered promiscuous mode
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.354 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 ovn_controller[133293]: 2026-01-23T10:21:43Z|00056|binding|INFO|Claiming lport dfaa68a5-31a2-4de5-996e-11936357ca9b for this chassis.
Jan 23 10:21:43 compute-1 ovn_controller[133293]: 2026-01-23T10:21:43Z|00057|binding|INFO|dfaa68a5-31a2-4de5-996e-11936357ca9b: Claiming fa:16:3e:b7:90:a0 10.100.0.11
Jan 23 10:21:43 compute-1 NetworkManager[48978]: <info>  [1769163703.3576] manager: (tapdfaa68a5-31): new Tun device (/org/freedesktop/NetworkManager/Devices/42)
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.360 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.362 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.380 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:90:a0 10.100.0.11'], port_security=['fa:16:3e:b7:90:a0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8afa87f8-5b22-4350-8bf2-c7af019c3372', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dbc1781-4648-4570-b3c6-0353674ab246, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=dfaa68a5-31a2-4de5-996e-11936357ca9b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.382 143098 INFO neutron.agent.ovn.metadata.agent [-] Port dfaa68a5-31a2-4de5-996e-11936357ca9b in datapath eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 bound to our chassis
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.383 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eae9e618-a7c2-43e9-ab46-9070ca2ef7f2
Jan 23 10:21:43 compute-1 systemd-udevd[233875]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:21:43 compute-1 systemd-machined[194551]: New machine qemu-3-instance-00000006.
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.405 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3c00033c-26f9-4652-8664-d33119a758c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.409 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeae9e618-a1 in ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:21:43 compute-1 NetworkManager[48978]: <info>  [1769163703.4137] device (tapdfaa68a5-31): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.414 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeae9e618-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.414 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[99a09d0a-c8e2-486a-b180-33c3213e5799]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 NetworkManager[48978]: <info>  [1769163703.4151] device (tapdfaa68a5-31): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.415 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e580114e-78e5-40e7-a2ec-b8077166c36d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.425 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 ovn_controller[133293]: 2026-01-23T10:21:43Z|00058|binding|INFO|Setting lport dfaa68a5-31a2-4de5-996e-11936357ca9b ovn-installed in OVS
Jan 23 10:21:43 compute-1 ovn_controller[133293]: 2026-01-23T10:21:43Z|00059|binding|INFO|Setting lport dfaa68a5-31a2-4de5-996e-11936357ca9b up in Southbound
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.430 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.430 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[592bc69a-2f50-46e3-9902-7d3654d8c842]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.444 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[cb87e7b0-fed1-4a9c-9439-8e72a659474b]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.470 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[6da9db94-9cc8-4632-9a79-b8c02ecefa30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.475 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ce6ecdcd-ac60-4e49-9457-0314817d6176]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 NetworkManager[48978]: <info>  [1769163703.4764] manager: (tapeae9e618-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/43)
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.508 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[4b295205-bf2e-481e-a525-a70f7f83e392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.510 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[9c679fba-caa9-438f-b3b4-b76d780856f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ceph-mon[80126]: pgmap v853: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:21:43 compute-1 NetworkManager[48978]: <info>  [1769163703.5329] device (tapeae9e618-a0): carrier: link connected
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.538 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[3aab1e74-4f5e-4b4b-9bd1-702cfa636b09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.558 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[183d3b62-d34e-4c3b-b334-cae7766024eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeae9e618-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485751, 'reachable_time': 35285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 233910, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.575 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e171f94a-5372-4239-91c9-cc966e0512fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feee:bca2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 485751, 'tstamp': 485751}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 233911, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.594 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2e62c60b-001e-4044-97a4-3c186aa51f28]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeae9e618-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ee:bc:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485751, 'reachable_time': 35285, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 233912, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.622 225709 DEBUG nova.compute.manager [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG oslo_concurrency.lockutils [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG oslo_concurrency.lockutils [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG oslo_concurrency.lockutils [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.624 225709 DEBUG nova.compute.manager [req-4eb6c836-d8c8-4cc8-9394-54f14836a56a req-db6583e7-53db-4974-af14-6ca8a149e222 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Processing event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.624 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[391a2213-0d07-4662-93bb-afd44685f18a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.679 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4a2d609e-27e8-42bc-96ca-a283c4bec2cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.681 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeae9e618-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.681 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.682 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeae9e618-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:43 compute-1 kernel: tapeae9e618-a0: entered promiscuous mode
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.683 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 NetworkManager[48978]: <info>  [1769163703.6854] manager: (tapeae9e618-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.686 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.687 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeae9e618-a0, col_values=(('external_ids', {'iface-id': 'f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:21:43 compute-1 ovn_controller[133293]: 2026-01-23T10:21:43Z|00060|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.688 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 nova_compute[225705]: 2026-01-23 10:21:43.701 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.702 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.703 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[b6671725-223a-4350-8c25-a8d50d3c5e19]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.704 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: global
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     log         /dev/log local0 debug
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     log-tag     haproxy-metadata-proxy-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     user        root
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     group       root
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     maxconn     1024
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     pidfile     /var/lib/neutron/external/pids/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.pid.haproxy
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     daemon
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: defaults
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     log global
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     mode http
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     option httplog
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     option dontlognull
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     option http-server-close
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     option forwardfor
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     retries                 3
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     timeout http-request    30s
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     timeout connect         30s
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     timeout client          32s
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     timeout server          32s
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     timeout http-keep-alive 30s
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: listen listener
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     bind 169.254.169.254:80
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:     http-request add-header X-OVN-Network-ID eae9e618-a7c2-43e9-ab46-9070ca2ef7f2
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:21:43 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:43.706 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'env', 'PROCESS_TAG=haproxy-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eae9e618-a7c2-43e9-ab46-9070ca2ef7f2.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.002 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.002 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163704.0014431, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.002 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Started (Lifecycle Event)
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.008 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.012 225709 INFO nova.virt.libvirt.driver [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance spawned successfully.
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.013 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.022 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.025 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.033 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.034 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.034 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.034 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.035 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.035 225709 DEBUG nova.virt.libvirt.driver [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.042 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.043 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163704.0043259, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.043 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Paused (Lifecycle Event)
Jan 23 10:21:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:21:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:44.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.065 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.069 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163704.0071929, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.070 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Resumed (Lifecycle Event)
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.097 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.102 225709 INFO nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 10.80 seconds to spawn the instance on the hypervisor.
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.103 225709 DEBUG nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.104 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.136 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.165 225709 INFO nova.compute.manager [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 11.96 seconds to build instance.
Jan 23 10:21:44 compute-1 nova_compute[225705]: 2026-01-23 10:21:44.183 225709 DEBUG oslo_concurrency.lockutils [None req-5dec2ebc-485c-4329-ab67-dd5c97339edb f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.042s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:44 compute-1 podman[233985]: 2026-01-23 10:21:44.090848139 +0000 UTC m=+0.028274445 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:21:44 compute-1 sudo[233998]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:21:44 compute-1 sudo[233998]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:44 compute-1 sudo[233998]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:44 compute-1 podman[233985]: 2026-01-23 10:21:44.437090894 +0000 UTC m=+0.374517170 container create 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 10:21:44 compute-1 systemd[1]: Started libpod-conmon-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee.scope.
Jan 23 10:21:44 compute-1 sudo[234023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:21:44 compute-1 sudo[234023]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:21:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4002eb0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:44 compute-1 sudo[234023]: pam_unix(sudo:session): session closed for user root
Jan 23 10:21:44 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:21:44 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/33d908cc39ee3158e23a819479dadd5bc8e191abccd206682925f5884fa34301/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:21:44 compute-1 podman[233985]: 2026-01-23 10:21:44.621410251 +0000 UTC m=+0.558836557 container init 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 10:21:44 compute-1 podman[233985]: 2026-01-23 10:21:44.626380558 +0000 UTC m=+0.563806834 container start 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:21:44 compute-1 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : New worker (234056) forked
Jan 23 10:21:44 compute-1 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : Loading success.
Jan 23 10:21:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:44.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:21:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:45 compute-1 nova_compute[225705]: 2026-01-23 10:21:45.803 225709 DEBUG nova.compute.manager [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:21:45 compute-1 nova_compute[225705]: 2026-01-23 10:21:45.804 225709 DEBUG oslo_concurrency.lockutils [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:45 compute-1 nova_compute[225705]: 2026-01-23 10:21:45.804 225709 DEBUG oslo_concurrency.lockutils [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:45 compute-1 nova_compute[225705]: 2026-01-23 10:21:45.804 225709 DEBUG oslo_concurrency.lockutils [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:45 compute-1 nova_compute[225705]: 2026-01-23 10:21:45.805 225709 DEBUG nova.compute.manager [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:21:45 compute-1 nova_compute[225705]: 2026-01-23 10:21:45.805 225709 WARNING nova.compute.manager [req-7f6624a5-9fec-4f45-be52-acc2b6e88616 req-4604f41f-348f-49c8-9955-c15b2b53d891 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b for instance with vm_state active and task_state None.
Jan 23 10:21:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:46.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:46 compute-1 nova_compute[225705]: 2026-01-23 10:21:46.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:46 compute-1 nova_compute[225705]: 2026-01-23 10:21:46.556 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.581883) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706582073, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 2403, "num_deletes": 251, "total_data_size": 6597390, "memory_usage": 6707840, "flush_reason": "Manual Compaction"}
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Jan 23 10:21:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706627096, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 4220070, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25834, "largest_seqno": 28232, "table_properties": {"data_size": 4210435, "index_size": 6065, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 20450, "raw_average_key_size": 20, "raw_value_size": 4190923, "raw_average_value_size": 4207, "num_data_blocks": 262, "num_entries": 996, "num_filter_entries": 996, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163491, "oldest_key_time": 1769163491, "file_creation_time": 1769163706, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 45255 microseconds, and 9363 cpu microseconds.
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:21:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.627187) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 4220070 bytes OK
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.627224) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.666112) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.666190) EVENT_LOG_v1 {"time_micros": 1769163706666177, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.666231) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 6586774, prev total WAL file size 6587481, number of live WAL files 2.
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.668478) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032303038' seq:72057594037927935, type:22 .. '7061786F730032323630' seq:0, type:0; will stop at (end)
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(4121KB)], [51(12MB)]
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706668684, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 17044787, "oldest_snapshot_seqno": -1}
Jan 23 10:21:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:46.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 5925 keys, 14852407 bytes, temperature: kUnknown
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706914657, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 14852407, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14811516, "index_size": 24973, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14853, "raw_key_size": 150611, "raw_average_key_size": 25, "raw_value_size": 14703001, "raw_average_value_size": 2481, "num_data_blocks": 1017, "num_entries": 5925, "num_filter_entries": 5925, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163706, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.914996) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 14852407 bytes
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.916684) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 69.3 rd, 60.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.2 +0.0 blob) out(14.2 +0.0 blob), read-write-amplify(7.6) write-amplify(3.5) OK, records in: 6448, records dropped: 523 output_compression: NoCompression
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.916700) EVENT_LOG_v1 {"time_micros": 1769163706916692, "job": 30, "event": "compaction_finished", "compaction_time_micros": 246034, "compaction_time_cpu_micros": 34322, "output_level": 6, "num_output_files": 1, "total_output_size": 14852407, "num_input_records": 6448, "num_output_records": 5925, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706917588, "job": 30, "event": "table_file_deletion", "file_number": 53}
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163706920361, "job": 30, "event": "table_file_deletion", "file_number": 51}
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.668120) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920519) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920528) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920529) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920531) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:46 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:21:46.920532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:21:47 compute-1 ovn_controller[133293]: 2026-01-23T10:21:47Z|00061|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 10:21:47 compute-1 nova_compute[225705]: 2026-01-23 10:21:47.117 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:47 compute-1 NetworkManager[48978]: <info>  [1769163707.1180] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/45)
Jan 23 10:21:47 compute-1 NetworkManager[48978]: <info>  [1769163707.1191] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Jan 23 10:21:47 compute-1 ovn_controller[133293]: 2026-01-23T10:21:47Z|00062|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 10:21:47 compute-1 nova_compute[225705]: 2026-01-23 10:21:47.174 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:47 compute-1 nova_compute[225705]: 2026-01-23 10:21:47.179 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:47 compute-1 ceph-mon[80126]: pgmap v854: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:21:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:48.061 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:48.703 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:48 compute-1 nova_compute[225705]: 2026-01-23 10:21:48.753 225709 DEBUG nova.compute.manager [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:21:48 compute-1 nova_compute[225705]: 2026-01-23 10:21:48.754 225709 DEBUG nova.compute.manager [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:21:48 compute-1 nova_compute[225705]: 2026-01-23 10:21:48.754 225709 DEBUG oslo_concurrency.lockutils [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:21:48 compute-1 nova_compute[225705]: 2026-01-23 10:21:48.755 225709 DEBUG oslo_concurrency.lockutils [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:21:48 compute-1 nova_compute[225705]: 2026-01-23 10:21:48.755 225709 DEBUG nova.network.neutron [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:21:49 compute-1 ceph-mon[80126]: pgmap v855: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 984 KiB/s rd, 1.8 MiB/s wr, 68 op/s
Jan 23 10:21:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1255014004' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:21:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1255014004' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:21:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:49 compute-1 podman[234069]: 2026-01-23 10:21:49.677621769 +0000 UTC m=+0.073611758 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Jan 23 10:21:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:50.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:21:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:50.706 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:21:51 compute-1 nova_compute[225705]: 2026-01-23 10:21:51.049 225709 DEBUG nova.network.neutron [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:21:51 compute-1 nova_compute[225705]: 2026-01-23 10:21:51.050 225709 DEBUG nova.network.neutron [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:21:51 compute-1 nova_compute[225705]: 2026-01-23 10:21:51.067 225709 DEBUG oslo_concurrency.lockutils [req-43fe8c73-a1c1-40f4-b30b-1a9c0f2aa41f req-905c1e32-0d62-40ed-8d9e-35534c27c65d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:21:51 compute-1 ceph-mon[80126]: pgmap v856: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 954 KiB/s wr, 86 op/s
Jan 23 10:21:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:21:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:51 compute-1 nova_compute[225705]: 2026-01-23 10:21:51.542 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:51 compute-1 nova_compute[225705]: 2026-01-23 10:21:51.557 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:52.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:52 compute-1 ceph-mon[80126]: pgmap v857: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 269 KiB/s wr, 85 op/s
Jan 23 10:21:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:52.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:53 compute-1 ceph-mon[80126]: pgmap v858: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 269 KiB/s wr, 85 op/s
Jan 23 10:21:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:21:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:54.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:21:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:54.713 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:55.049 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:21:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:55.050 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:21:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:21:55.051 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:21:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:55 compute-1 ceph-mon[80126]: pgmap v859: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 73 op/s
Jan 23 10:21:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:56.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:56 compute-1 nova_compute[225705]: 2026-01-23 10:21:56.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:56 compute-1 nova_compute[225705]: 2026-01-23 10:21:56.558 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:21:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf040033c0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:56.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:57 compute-1 ceph-mon[80126]: pgmap v860: 353 pgs: 353 active+clean; 97 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 743 KiB/s wr, 85 op/s
Jan 23 10:21:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_28] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:57 compute-1 ovn_controller[133293]: 2026-01-23T10:21:57Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b7:90:a0 10.100.0.11
Jan 23 10:21:57 compute-1 ovn_controller[133293]: 2026-01-23T10:21:57Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b7:90:a0 10.100.0.11
Jan 23 10:21:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:21:58.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102158 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:21:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:21:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:21:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:21:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:21:58.719 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:21:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:21:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:21:59 compute-1 ceph-mon[80126]: pgmap v861: 353 pgs: 353 active+clean; 108 MiB data, 294 MiB used, 60 GiB / 60 GiB avail; 1.1 MiB/s rd, 1.8 MiB/s wr, 64 op/s
Jan 23 10:22:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:00.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:00.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:01 compute-1 ceph-mon[80126]: pgmap v862: 353 pgs: 353 active+clean; 108 MiB data, 294 MiB used, 60 GiB / 60 GiB avail; 91 KiB/s rd, 1.8 MiB/s wr, 30 op/s
Jan 23 10:22:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.559 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.561 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.562 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.562 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.575 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.576 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.947 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.947 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:22:01 compute-1 nova_compute[225705]: 2026-01-23 10:22:01.947 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:22:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:02.081 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:02 compute-1 nova_compute[225705]: 2026-01-23 10:22:02.485 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:22:02 compute-1 nova_compute[225705]: 2026-01-23 10:22:02.485 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:22:02 compute-1 nova_compute[225705]: 2026-01-23 10:22:02.485 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 10:22:02 compute-1 nova_compute[225705]: 2026-01-23 10:22:02.486 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:22:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0003c70 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:02.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:03 compute-1 nova_compute[225705]: 2026-01-23 10:22:03.271 225709 INFO nova.compute.manager [None req-c635b348-c6e2-40be-9940-9413dfd2cffe f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Get console output
Jan 23 10:22:03 compute-1 nova_compute[225705]: 2026-01-23 10:22:03.278 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 10:22:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:03 compute-1 ceph-mon[80126]: pgmap v863: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 23 10:22:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:04.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:04 compute-1 nova_compute[225705]: 2026-01-23 10:22:04.094 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:22:04 compute-1 nova_compute[225705]: 2026-01-23 10:22:04.117 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:22:04 compute-1 nova_compute[225705]: 2026-01-23 10:22:04.118 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 10:22:04 compute-1 nova_compute[225705]: 2026-01-23 10:22:04.118 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:04 compute-1 sudo[234097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:22:04 compute-1 sudo[234097]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:04 compute-1 sudo[234097]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:22:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:04.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:22:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1253421413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:04 compute-1 nova_compute[225705]: 2026-01-23 10:22:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:04 compute-1 nova_compute[225705]: 2026-01-23 10:22:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:05 compute-1 ceph-mon[80126]: pgmap v864: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 269 KiB/s rd, 2.1 MiB/s wr, 60 op/s
Jan 23 10:22:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1615457187' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:06.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.577 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.580 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.581 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:06.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:06 compute-1 podman[234123]: 2026-01-23 10:22:06.740717956 +0000 UTC m=+0.126096947 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.899 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.900 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:22:06 compute-1 nova_compute[225705]: 2026-01-23 10:22:06.901 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.000 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.001 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.001 225709 DEBUG nova.objects.instance [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.279 225709 DEBUG nova.objects.instance [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_requests' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:22:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:22:07 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/748902986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.363 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:22:07 compute-1 ceph-mon[80126]: pgmap v865: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 270 KiB/s rd, 2.1 MiB/s wr, 61 op/s
Jan 23 10:22:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2351850048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/748902986' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.893 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.971 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:22:07 compute-1 nova_compute[225705]: 2026-01-23 10:22:07.972 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:22:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:08.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.177 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.178 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4699MB free_disk=59.942752838134766GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.178 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.179 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.267 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.268 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.268 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.319 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.541 225709 DEBUG nova.policy [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:22:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4003bc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:08.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:22:08 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1277093325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.800 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.807 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.831 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.882 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:22:08 compute-1 nova_compute[225705]: 2026-01-23 10:22:08.882 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.704s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1160861138' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1277093325' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:09 compute-1 nova_compute[225705]: 2026-01-23 10:22:09.207 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully created port: d372b816-e400-40ea-9e6b-cc8c21e54bc6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:22:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_30] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:09 compute-1 nova_compute[225705]: 2026-01-23 10:22:09.883 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-1 nova_compute[225705]: 2026-01-23 10:22:09.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-1 nova_compute[225705]: 2026-01-23 10:22:09.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-1 nova_compute[225705]: 2026-01-23 10:22:09.885 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:22:09 compute-1 nova_compute[225705]: 2026-01-23 10:22:09.886 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:22:09 compute-1 ceph-mon[80126]: pgmap v866: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 249 KiB/s rd, 1.4 MiB/s wr, 49 op/s
Jan 23 10:22:10 compute-1 nova_compute[225705]: 2026-01-23 10:22:10.031 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Successfully updated port: d372b816-e400-40ea-9e6b-cc8c21e54bc6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:22:10 compute-1 nova_compute[225705]: 2026-01-23 10:22:10.050 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:22:10 compute-1 nova_compute[225705]: 2026-01-23 10:22:10.051 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:22:10 compute-1 nova_compute[225705]: 2026-01-23 10:22:10.051 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:22:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:10.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:10 compute-1 nova_compute[225705]: 2026-01-23 10:22:10.145 225709 DEBUG nova.compute.manager [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:10 compute-1 nova_compute[225705]: 2026-01-23 10:22:10.146 225709 DEBUG nova.compute.manager [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:22:10 compute-1 nova_compute[225705]: 2026-01-23 10:22:10.147 225709 DEBUG oslo_concurrency.lockutils [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:22:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_29] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:22:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:22:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:10.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:11 compute-1 ceph-mon[80126]: pgmap v867: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 179 KiB/s rd, 339 KiB/s wr, 30 op/s
Jan 23 10:22:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:11 compute-1 nova_compute[225705]: 2026-01-23 10:22:11.581 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:12.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.562 225709 DEBUG nova.network.neutron [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": 
"acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.582 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.583 225709 DEBUG oslo_concurrency.lockutils [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.584 225709 DEBUG nova.network.neutron [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.590 225709 DEBUG nova.virt.libvirt.vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.591 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.592 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.593 225709 DEBUG os_vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.594 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.595 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.596 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.605 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.605 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd372b816-e4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.606 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd372b816-e4, col_values=(('external_ids', {'iface-id': 'd372b816-e400-40ea-9e6b-cc8c21e54bc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:02:7d:ea', 'vm-uuid': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.609 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 NetworkManager[48978]: <info>  [1769163732.6109] manager: (tapd372b816-e4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Jan 23 10:22:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.613 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.623 225709 INFO os_vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4')
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.625 225709 DEBUG nova.virt.libvirt.vif [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.625 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.627 225709 DEBUG nova.network.os_vif_util [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.630 225709 DEBUG nova.virt.libvirt.guest [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] attach device xml: <interface type="ethernet">
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <mac address="fa:16:3e:02:7d:ea"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <model type="virtio"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <mtu size="1442"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <target dev="tapd372b816-e4"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]: </interface>
Jan 23 10:22:12 compute-1 nova_compute[225705]:  attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339
Jan 23 10:22:12 compute-1 kernel: tapd372b816-e4: entered promiscuous mode
Jan 23 10:22:12 compute-1 NetworkManager[48978]: <info>  [1769163732.6487] manager: (tapd372b816-e4): new Tun device (/org/freedesktop/NetworkManager/Devices/48)
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.650 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 ovn_controller[133293]: 2026-01-23T10:22:12Z|00063|binding|INFO|Claiming lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 for this chassis.
Jan 23 10:22:12 compute-1 ovn_controller[133293]: 2026-01-23T10:22:12Z|00064|binding|INFO|d372b816-e400-40ea-9e6b-cc8c21e54bc6: Claiming fa:16:3e:02:7d:ea 10.100.0.20
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.663 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:7d:ea 10.100.0.20'], port_security=['fa:16:3e:02:7d:ea 10.100.0.20'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '41f899d0-e5bc-43b7-808c-efb54f22dad4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=d372b816-e400-40ea-9e6b-cc8c21e54bc6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.664 143098 INFO neutron.agent.ovn.metadata.agent [-] Port d372b816-e400-40ea-9e6b-cc8c21e54bc6 in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 bound to our chassis
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.666 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.682 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2987cd-91f3-48d3-bc10-886d8aa2f72a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.684 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap6a09a282-a1 in ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.685 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap6a09a282-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.686 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[db07c222-d1a2-449a-849d-3ef7aa92bd80]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.687 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c2064f7f-fc77-4a6a-8950-322230437231]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002af0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.695 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 ovn_controller[133293]: 2026-01-23T10:22:12Z|00065|binding|INFO|Setting lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 ovn-installed in OVS
Jan 23 10:22:12 compute-1 ovn_controller[133293]: 2026-01-23T10:22:12Z|00066|binding|INFO|Setting lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 up in Southbound
Jan 23 10:22:12 compute-1 systemd-udevd[234206]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.706 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.705 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[34d01c08-8f95-4df9-8928-dc76510a2a79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 NetworkManager[48978]: <info>  [1769163732.7301] device (tapd372b816-e4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:22:12 compute-1 NetworkManager[48978]: <info>  [1769163732.7308] device (tapd372b816-e4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.734 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e5ff1263-c9db-47db-8f21-c9ec94a08799]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:12.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.764 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.765 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.765 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:b7:90:a0, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.766 225709 DEBUG nova.virt.libvirt.driver [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:02:7d:ea, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.770 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[e820c2e7-dc0e-4068-8ade-795becf7a924]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 NetworkManager[48978]: <info>  [1769163732.7769] manager: (tap6a09a282-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/49)
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.776 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4d2c722a-99aa-4559-9a65-98b4d2f8b60d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.795 225709 DEBUG nova.virt.libvirt.guest [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:22:12</nova:creationTime>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:22:12 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     <nova:port uuid="d372b816-e400-40ea-9e6b-cc8c21e54bc6">
Jan 23 10:22:12 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 23 10:22:12 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:22:12 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:22:12 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:22:12 compute-1 nova_compute[225705]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.819 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c135aba3-16ca-4414-9431-35c263a34d16]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.822 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[d063cf37-dfad-4c10-989e-c4afbbc728cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 nova_compute[225705]: 2026-01-23 10:22:12.826 225709 DEBUG oslo_concurrency.lockutils [None req-3171f94b-a947-43f3-b389-f063f6717aaf f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 5.825s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:12 compute-1 NetworkManager[48978]: <info>  [1769163732.8486] device (tap6a09a282-a0): carrier: link connected
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.853 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[0cf0ad97-7666-4eef-8272-f06c832404ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.871 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[85df82a3-f20b-41f7-ab4f-1fab18b4116a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488683, 'reachable_time': 26349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234231, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.894 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[231545f6-db11-436c-b63b-90daae1d3198]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee9:9ba3'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 488683, 'tstamp': 488683}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 234232, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.919 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[fe9c42c9-107e-41d1-8310-f499ee3f10fd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap6a09a282-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e9:9b:a3'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488683, 'reachable_time': 26349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 234233, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:12 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:12.961 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ab509599-f2e0-4648-b061-710b6b68f5f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.040 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[ba85d32a-5df3-4dc0-91c7-e08c6dd76f01]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.043 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.043 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.044 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6a09a282-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:13 compute-1 NetworkManager[48978]: <info>  [1769163733.0480] manager: (tap6a09a282-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Jan 23 10:22:13 compute-1 kernel: tap6a09a282-a0: entered promiscuous mode
Jan 23 10:22:13 compute-1 nova_compute[225705]: 2026-01-23 10:22:13.047 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:13 compute-1 nova_compute[225705]: 2026-01-23 10:22:13.051 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.052 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap6a09a282-a0, col_values=(('external_ids', {'iface-id': 'f3eaa8c6-94ad-445d-ab48-59e26f30c078'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:13 compute-1 ovn_controller[133293]: 2026-01-23T10:22:13Z|00067|binding|INFO|Releasing lport f3eaa8c6-94ad-445d-ab48-59e26f30c078 from this chassis (sb_readonly=0)
Jan 23 10:22:13 compute-1 nova_compute[225705]: 2026-01-23 10:22:13.054 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:13 compute-1 nova_compute[225705]: 2026-01-23 10:22:13.080 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:13 compute-1 nova_compute[225705]: 2026-01-23 10:22:13.084 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.085 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.086 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7097759a-b434-4c3e-89ba-bc5122c140d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.088 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: global
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     log         /dev/log local0 debug
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     log-tag     haproxy-metadata-proxy-6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     user        root
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     group       root
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     maxconn     1024
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     pidfile     /var/lib/neutron/external/pids/6a09a282-aa22-47cf-a68d-ce0dba493868.pid.haproxy
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     daemon
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: defaults
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     log global
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     mode http
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     option httplog
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     option dontlognull
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     option http-server-close
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     option forwardfor
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     retries                 3
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     timeout http-request    30s
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     timeout connect         30s
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     timeout client          32s
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     timeout server          32s
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     timeout http-keep-alive 30s
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: listen listener
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     bind 169.254.169.254:80
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:     http-request add-header X-OVN-Network-ID 6a09a282-aa22-47cf-a68d-ce0dba493868
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:22:13 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:13.089 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'env', 'PROCESS_TAG=haproxy-6a09a282-aa22-47cf-a68d-ce0dba493868', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/6a09a282-aa22-47cf-a68d-ce0dba493868.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:22:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:13 compute-1 podman[234266]: 2026-01-23 10:22:13.519780464 +0000 UTC m=+0.067680610 container create 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 10:22:13 compute-1 systemd[1]: Started libpod-conmon-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315.scope.
Jan 23 10:22:13 compute-1 podman[234266]: 2026-01-23 10:22:13.48390461 +0000 UTC m=+0.031804846 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:22:13 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:22:13 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44d1fb66475eda2601ea35994a86f5afc04dbdd9f74d7699ab0867a140ff5c5c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:22:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:13 compute-1 podman[234266]: 2026-01-23 10:22:13.636313288 +0000 UTC m=+0.184213454 container init 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:22:13 compute-1 podman[234266]: 2026-01-23 10:22:13.646240332 +0000 UTC m=+0.194140518 container start 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:22:13 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : New worker (234288) forked
Jan 23 10:22:13 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : Loading success.
Jan 23 10:22:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:22:13 compute-1 ceph-mon[80126]: pgmap v868: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 180 KiB/s rd, 342 KiB/s wr, 32 op/s
Jan 23 10:22:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:14.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:14 compute-1 ovn_controller[133293]: 2026-01-23T10:22:14Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:02:7d:ea 10.100.0.20
Jan 23 10:22:14 compute-1 ovn_controller[133293]: 2026-01-23T10:22:14Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:02:7d:ea 10.100.0.20
Jan 23 10:22:14 compute-1 nova_compute[225705]: 2026-01-23 10:22:14.602 225709 DEBUG nova.compute.manager [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:14 compute-1 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG oslo_concurrency.lockutils [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:14 compute-1 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG oslo_concurrency.lockutils [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:14 compute-1 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG oslo_concurrency.lockutils [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:14 compute-1 nova_compute[225705]: 2026-01-23 10:22:14.603 225709 DEBUG nova.compute.manager [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:22:14 compute-1 nova_compute[225705]: 2026-01-23 10:22:14.604 225709 WARNING nova.compute.manager [req-9635108a-d624-41bc-bcd3-f6166285a76a req-7237f7bf-730b-499a-9d5c-155096feb7d8 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.
Jan 23 10:22:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:14.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14002af0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:15 compute-1 ceph-mon[80126]: pgmap v869: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 15 KiB/s wr, 3 op/s
Jan 23 10:22:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:16.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:16 compute-1 nova_compute[225705]: 2026-01-23 10:22:16.296 225709 DEBUG nova.network.neutron [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:22:16 compute-1 nova_compute[225705]: 2026-01-23 10:22:16.297 225709 DEBUG nova.network.neutron [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:22:16 compute-1 nova_compute[225705]: 2026-01-23 10:22:16.311 225709 DEBUG oslo_concurrency.lockutils [req-4d043b96-ad63-46d9-acfa-d9d635979a60 req-2ba5612a-2eb8-4ed8-9a46-f0260c34b25c 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:22:16 compute-1 nova_compute[225705]: 2026-01-23 10:22:16.584 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:16.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:17 compute-1 nova_compute[225705]: 2026-01-23 10:22:17.024 225709 DEBUG nova.compute.manager [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:17 compute-1 nova_compute[225705]: 2026-01-23 10:22:17.024 225709 DEBUG oslo_concurrency.lockutils [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:17 compute-1 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 DEBUG oslo_concurrency.lockutils [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:17 compute-1 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 DEBUG oslo_concurrency.lockutils [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:17 compute-1 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 DEBUG nova.compute.manager [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:22:17 compute-1 nova_compute[225705]: 2026-01-23 10:22:17.025 225709 WARNING nova.compute.manager [req-d8a811e3-7ad3-4a9e-ac27-cec7069c067e req-e6b4f6e1-e52d-4c9f-a52a-308e17f5f63e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.
Jan 23 10:22:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:17 compute-1 nova_compute[225705]: 2026-01-23 10:22:17.610 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:17 compute-1 ceph-mon[80126]: pgmap v870: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 16 KiB/s wr, 4 op/s
Jan 23 10:22:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:18.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:18.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:19 compute-1 ceph-mon[80126]: pgmap v871: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 3.6 KiB/s wr, 3 op/s
Jan 23 10:22:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:20.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102220 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:22:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:20 compute-1 podman[234300]: 2026-01-23 10:22:20.663772039 +0000 UTC m=+0.063431756 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:22:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:20.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:21 compute-1 nova_compute[225705]: 2026-01-23 10:22:21.586 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:21 compute-1 ceph-mon[80126]: pgmap v872: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.7 KiB/s rd, 3.2 KiB/s wr, 3 op/s
Jan 23 10:22:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:22.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:22 compute-1 nova_compute[225705]: 2026-01-23 10:22:22.613 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:22.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:23 compute-1 ceph-mon[80126]: pgmap v873: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 2.8 KiB/s rd, 6.6 KiB/s wr, 3 op/s
Jan 23 10:22:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1943420009' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:22:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:24.120 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:24 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:24.263 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:22:24 compute-1 nova_compute[225705]: 2026-01-23 10:22:24.264 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:24 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:24.265 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:22:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:24 compute-1 sudo[234322]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:22:24 compute-1 sudo[234322]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:24 compute-1 sudo[234322]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:24.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:25 compute-1 sshd-session[234347]: Invalid user sol from 45.148.10.240 port 33458
Jan 23 10:22:25 compute-1 sshd-session[234347]: Connection closed by invalid user sol 45.148.10.240 port 33458 [preauth]
Jan 23 10:22:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:25 compute-1 ceph-mon[80126]: pgmap v874: 353 pgs: 353 active+clean; 121 MiB data, 306 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 3.7 KiB/s wr, 1 op/s
Jan 23 10:22:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:26.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:26 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:26.268 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:22:26 compute-1 nova_compute[225705]: 2026-01-23 10:22:26.589 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:26.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:27 compute-1 ceph-mon[80126]: pgmap v875: 353 pgs: 353 active+clean; 144 MiB data, 309 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 246 KiB/s wr, 27 op/s
Jan 23 10:22:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:27 compute-1 nova_compute[225705]: 2026-01-23 10:22:27.615 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:28.126 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/815013215' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:22:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2638325462' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:22:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:28 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:28 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:28.770 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:29 compute-1 ceph-mon[80126]: pgmap v876: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:22:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:29 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:30.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:30 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:30 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:30.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:31 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:31 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:31 compute-1 nova_compute[225705]: 2026-01-23 10:22:31.592 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:31 compute-1 ceph-mon[80126]: pgmap v877: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:22:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:32.132 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:32 compute-1 nova_compute[225705]: 2026-01-23 10:22:32.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:32 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:32.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:33 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:33 compute-1 ceph-mon[80126]: pgmap v878: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 102 op/s
Jan 23 10:22:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:34.135 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:34 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:34 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:34.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:35 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:35 compute-1 ceph-mon[80126]: pgmap v879: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:22:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:36.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:36 compute-1 nova_compute[225705]: 2026-01-23 10:22:36.595 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:36 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:36 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:36.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:37 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:37 compute-1 nova_compute[225705]: 2026-01-23 10:22:37.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:37 compute-1 podman[234356]: 2026-01-23 10:22:37.693667439 +0000 UTC m=+0.096563583 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:22:37 compute-1 ceph-mon[80126]: pgmap v880: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.8 MiB/s wr, 186 op/s
Jan 23 10:22:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:38.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:38 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:38 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:38.786 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:39 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:39 compute-1 ceph-mon[80126]: pgmap v881: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 1.6 MiB/s wr, 207 op/s
Jan 23 10:22:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:40.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:40 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:40 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001fc0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:40.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:41 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:41 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_33] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:41 compute-1 nova_compute[225705]: 2026-01-23 10:22:41.598 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:41 compute-1 ceph-mon[80126]: pgmap v882: 353 pgs: 353 active+clean; 167 MiB data, 327 MiB used, 60 GiB / 60 GiB avail; 2.0 MiB/s rd, 17 KiB/s wr, 206 op/s
Jan 23 10:22:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:42.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:42 compute-1 nova_compute[225705]: 2026-01-23 10:22:42.624 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:42 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:42.792 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:43 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:43 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:43 compute-1 ceph-mon[80126]: pgmap v883: 353 pgs: 353 active+clean; 175 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 1.2 MiB/s wr, 286 op/s
Jan 23 10:22:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:44.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:44 compute-1 sudo[234386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:22:44 compute-1 sudo[234386]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:44 compute-1 sudo[234386]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:44 compute-1 sudo[234411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:22:44 compute-1 sudo[234411]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:44 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:44 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:44.795 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:44 compute-1 sudo[234450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:22:44 compute-1 sudo[234450]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:44 compute-1 sudo[234450]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:45 compute-1 sudo[234411]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:45 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:45 compute-1 ceph-mon[80126]: pgmap v884: 353 pgs: 353 active+clean; 175 MiB data, 332 MiB used, 60 GiB / 60 GiB avail; 248 KiB/s rd, 1.1 MiB/s wr, 211 op/s
Jan 23 10:22:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:22:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:22:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:22:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:22:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:22:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:46.323 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:46 compute-1 nova_compute[225705]: 2026-01-23 10:22:46.603 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:46 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:46 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8003e60 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:46.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:47 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:47 compute-1 nova_compute[225705]: 2026-01-23 10:22:47.626 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:47 compute-1 ceph-mon[80126]: pgmap v885: 353 pgs: 353 active+clean; 199 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 460 KiB/s rd, 2.1 MiB/s wr, 243 op/s
Jan 23 10:22:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:48.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8003a10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:48 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:48 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:48.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:22:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2595657408' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:22:49 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:49 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:49 compute-1 sudo[234497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:22:49 compute-1 sudo[234497]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:22:49 compute-1 sudo[234497]: pam_unix(sudo:session): session closed for user root
Jan 23 10:22:49 compute-1 ceph-mon[80126]: pgmap v886: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 446 KiB/s rd, 2.1 MiB/s wr, 162 op/s
Jan 23 10:22:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:49 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:22:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:50.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102250 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:22:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:50 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:50 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:50.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:22:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:51 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:51 compute-1 nova_compute[225705]: 2026-01-23 10:22:51.605 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:51 compute-1 podman[234523]: 2026-01-23 10:22:51.656116578 +0000 UTC m=+0.053381828 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Jan 23 10:22:51 compute-1 ceph-mon[80126]: pgmap v887: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 417 KiB/s rd, 2.1 MiB/s wr, 114 op/s
Jan 23 10:22:51 compute-1 nova_compute[225705]: 2026-01-23 10:22:51.967 225709 DEBUG nova.compute.manager [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:22:51 compute-1 nova_compute[225705]: 2026-01-23 10:22:51.968 225709 DEBUG nova.compute.manager [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-d372b816-e400-40ea-9e6b-cc8c21e54bc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:22:51 compute-1 nova_compute[225705]: 2026-01-23 10:22:51.969 225709 DEBUG oslo_concurrency.lockutils [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:22:51 compute-1 nova_compute[225705]: 2026-01-23 10:22:51.969 225709 DEBUG oslo_concurrency.lockutils [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:22:51 compute-1 nova_compute[225705]: 2026-01-23 10:22:51.969 225709 DEBUG nova.network.neutron [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:22:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:52.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:52 compute-1 nova_compute[225705]: 2026-01-23 10:22:52.629 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:52 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:52.812 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:52 compute-1 ceph-mon[80126]: pgmap v888: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 417 KiB/s rd, 2.2 MiB/s wr, 115 op/s
Jan 23 10:22:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:53 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:53 compute-1 nova_compute[225705]: 2026-01-23 10:22:53.691 225709 DEBUG nova.network.neutron [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port d372b816-e400-40ea-9e6b-cc8c21e54bc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:22:53 compute-1 nova_compute[225705]: 2026-01-23 10:22:53.691 225709 DEBUG nova.network.neutron [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:22:53 compute-1 nova_compute[225705]: 2026-01-23 10:22:53.713 225709 DEBUG oslo_concurrency.lockutils [req-629b28aa-eab6-43e0-9020-cc10bf861988 req-f099aa98-ff0d-460d-ab53-cfcbfbf4ea8d 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:22:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:54.338 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:54 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:54.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:55.051 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:22:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:55.052 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:22:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:22:55.053 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:22:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:55 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee8001f90 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:55 compute-1 ceph-mon[80126]: pgmap v889: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 253 KiB/s rd, 1.0 MiB/s wr, 35 op/s
Jan 23 10:22:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:56.341 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:56 compute-1 nova_compute[225705]: 2026-01-23 10:22:56.607 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:56 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:22:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:56.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:22:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:57 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:57 compute-1 nova_compute[225705]: 2026-01-23 10:22:57.633 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:22:57 compute-1 ceph-mon[80126]: pgmap v890: 353 pgs: 353 active+clean; 200 MiB data, 353 MiB used, 60 GiB / 60 GiB avail; 253 KiB/s rd, 1.0 MiB/s wr, 35 op/s
Jan 23 10:22:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:22:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:22:58.345 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:22:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:22:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:58 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:22:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:22:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:22:58.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:22:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:22:59 compute-1 ceph-mon[80126]: pgmap v891: 353 pgs: 353 active+clean; 173 MiB data, 330 MiB used, 60 GiB / 60 GiB avail; 43 KiB/s rd, 18 KiB/s wr, 6 op/s
Jan 23 10:22:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:22:59 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:23:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:00.348 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:00 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:00.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:01 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:01 compute-1 nova_compute[225705]: 2026-01-23 10:23:01.609 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:01 compute-1 ceph-mon[80126]: pgmap v892: 353 pgs: 353 active+clean; 173 MiB data, 330 MiB used, 60 GiB / 60 GiB avail; 1.1 KiB/s rd, 17 KiB/s wr, 3 op/s
Jan 23 10:23:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2480691247' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.310 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-d372b816-e400-40ea-9e6b-cc8c21e54bc6" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.311 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-d372b816-e400-40ea-9e6b-cc8c21e54bc6" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.328 225709 DEBUG nova.objects.instance [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'flavor' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.349 225709 DEBUG nova.virt.libvirt.vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.350 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.350 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:23:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:02.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.355 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.359 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.362 225709 DEBUG nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Attempting to detach device tapd372b816-e4 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.363 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <mac address="fa:16:3e:02:7d:ea"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <model type="virtio"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <mtu size="1442"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <target dev="tapd372b816-e4"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]: </interface>
Jan 23 10:23:02 compute-1 nova_compute[225705]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.370 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.374 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <name>instance-00000006</name>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:22:12</nova:creationTime>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:port uuid="d372b816-e400-40ea-9e6b-cc8c21e54bc6">
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:23:02 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <system>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </system>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <os>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </os>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <features>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </features>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target dev='tapdfaa68a5-31'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:02:7d:ea'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target dev='tapd372b816-e4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='net1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </target>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </console>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <video>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </video>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]: </domain>
Jan 23 10:23:02 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.375 225709 INFO nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tapd372b816-e4 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the persistent domain config.
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.376 225709 DEBUG nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] (1/8): Attempting to detach device tapd372b816-e4 with device alias net1 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.376 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] detach device xml: <interface type="ethernet">
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <mac address="fa:16:3e:02:7d:ea"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <model type="virtio"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <mtu size="1442"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <target dev="tapd372b816-e4"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]: </interface>
Jan 23 10:23:02 compute-1 nova_compute[225705]:  detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465
Jan 23 10:23:02 compute-1 kernel: tapd372b816-e4 (unregistering): left promiscuous mode
Jan 23 10:23:02 compute-1 NetworkManager[48978]: <info>  [1769163782.4877] device (tapd372b816-e4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.500 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 ovn_controller[133293]: 2026-01-23T10:23:02Z|00068|binding|INFO|Releasing lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 from this chassis (sb_readonly=0)
Jan 23 10:23:02 compute-1 ovn_controller[133293]: 2026-01-23T10:23:02Z|00069|binding|INFO|Setting lport d372b816-e400-40ea-9e6b-cc8c21e54bc6 down in Southbound
Jan 23 10:23:02 compute-1 ovn_controller[133293]: 2026-01-23T10:23:02Z|00070|binding|INFO|Removing iface tapd372b816-e4 ovn-installed in OVS
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.503 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.505 225709 DEBUG nova.virt.libvirt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Received event <DeviceRemovedEvent: 1769163782.504969, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.507 225709 DEBUG nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Start waiting for the detach event from libvirt for device tapd372b816-e4 with device alias net1 for instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.508 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.510 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:02:7d:ea 10.100.0.20', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.20/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6a09a282-aa22-47cf-a68d-ce0dba493868', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8dee54ab-ce3c-4b4e-ac76-15d1824a947d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=d372b816-e400-40ea-9e6b-cc8c21e54bc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.512 143098 INFO neutron.agent.ovn.metadata.agent [-] Port d372b816-e400-40ea-9e6b-cc8c21e54bc6 in datapath 6a09a282-aa22-47cf-a68d-ce0dba493868 unbound from our chassis
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.513 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 6a09a282-aa22-47cf-a68d-ce0dba493868, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.515 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <name>instance-00000006</name>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:22:12</nova:creationTime>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:port uuid="d372b816-e400-40ea-9e6b-cc8c21e54bc6">
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.20" ipVersion="4"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:23:02 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <system>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </system>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <os>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </os>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <features>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </features>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target dev='tapdfaa68a5-31'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       </target>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </console>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <video>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </video>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:02 compute-1 nova_compute[225705]: </domain>
Jan 23 10:23:02 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.515 225709 INFO nova.virt.libvirt.driver [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully detached device tapd372b816-e4 from instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 from the live domain config.
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.516 225709 DEBUG nova.virt.libvirt.vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.516 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.517 225709 DEBUG nova.network.os_vif_util [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.517 225709 DEBUG os_vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.516 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[9cfe9cd4-f62f-4027-9bf9-7e5ba7bb2e2f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.519 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.520 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd372b816-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.520 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 namespace which is not needed anymore
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.521 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.521 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.526 225709 INFO os_vif [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4')
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.527 225709 DEBUG nova.virt.libvirt.guest [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:23:02</nova:creationTime>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:23:02 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:23:02 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:02 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:23:02 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:23:02 compute-1 nova_compute[225705]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 10:23:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:02 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : haproxy version is 2.8.14-c23fe91
Jan 23 10:23:02 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [NOTICE]   (234286) : path to executable is /usr/sbin/haproxy
Jan 23 10:23:02 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [WARNING]  (234286) : Exiting Master process...
Jan 23 10:23:02 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [WARNING]  (234286) : Exiting Master process...
Jan 23 10:23:02 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [ALERT]    (234286) : Current worker (234288) exited with code 143 (Terminated)
Jan 23 10:23:02 compute-1 neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868[234282]: [WARNING]  (234286) : All workers exited. Exiting... (0)
Jan 23 10:23:02 compute-1 systemd[1]: libpod-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315.scope: Deactivated successfully.
Jan 23 10:23:02 compute-1 podman[234571]: 2026-01-23 10:23:02.68008732 +0000 UTC m=+0.048972678 container died 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:23:02 compute-1 systemd[1]: var-lib-containers-storage-overlay-44d1fb66475eda2601ea35994a86f5afc04dbdd9f74d7699ab0867a140ff5c5c-merged.mount: Deactivated successfully.
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.711 225709 DEBUG nova.compute.manager [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-unplugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:02 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315-userdata-shm.mount: Deactivated successfully.
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.712 225709 DEBUG oslo_concurrency.lockutils [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 DEBUG oslo_concurrency.lockutils [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 DEBUG oslo_concurrency.lockutils [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 DEBUG nova.compute.manager [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-unplugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.713 225709 WARNING nova.compute.manager [req-08735331-99f8-4212-a78e-52fe0378726a req-6a27bde7-42f3-44b4-946c-574cf990a8da 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-unplugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.
Jan 23 10:23:02 compute-1 podman[234571]: 2026-01-23 10:23:02.718942088 +0000 UTC m=+0.087827476 container cleanup 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Jan 23 10:23:02 compute-1 systemd[1]: libpod-conmon-30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315.scope: Deactivated successfully.
Jan 23 10:23:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:02 compute-1 podman[234601]: 2026-01-23 10:23:02.793480265 +0000 UTC m=+0.047473162 container remove 30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.802 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0ffad06e-1231-4312-871d-dd910e56b1cc]: (4, ('Fri Jan 23 10:23:02 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315)\n30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315\nFri Jan 23 10:23:02 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 (30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315)\n30df05e45d297f3b1d9e806f66dd2f70a4efc96233255c584b22978f2d737315\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.804 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[a5020298-920b-4b2e-99c2-eaeb005d1993]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.806 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6a09a282-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:23:02 compute-1 kernel: tap6a09a282-a0: left promiscuous mode
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.808 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 nova_compute[225705]: 2026-01-23 10:23:02.827 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:02.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.831 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[77143509-cd5a-4625-b737-ff322b64cbd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.848 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e37facdb-ab54-46f6-a1e6-26e13b7fe821]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.849 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2d026b22-a096-46ae-a4e7-e4c0aa627e89]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.873 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d89314c4-bc78-413b-8619-0d11817a08d8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 488674, 'reachable_time': 33027, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234616, 'error': None, 'target': 'ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.875 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-6a09a282-aa22-47cf-a68d-ce0dba493868 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:23:02 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:02.876 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[f1ac2f4b-eb5b-4eee-b935-e319b98de284]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:02 compute-1 systemd[1]: run-netns-ovnmeta\x2d6a09a282\x2daa22\x2d47cf\x2da68d\x2dce0dba493868.mount: Deactivated successfully.
Jan 23 10:23:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:23:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:02 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:23:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:03 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:03 compute-1 ceph-mon[80126]: pgmap v893: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 20 KiB/s wr, 30 op/s
Jan 23 10:23:03 compute-1 nova_compute[225705]: 2026-01-23 10:23:03.872 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:03 compute-1 nova_compute[225705]: 2026-01-23 10:23:03.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:03 compute-1 nova_compute[225705]: 2026-01-23 10:23:03.873 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:23:03 compute-1 nova_compute[225705]: 2026-01-23 10:23:03.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:23:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:04.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.564 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.564 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.565 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.565 225709 DEBUG nova.objects.instance [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lazy-loading 'info_cache' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:23:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:04 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:04.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.853 225709 DEBUG nova.compute.manager [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG oslo_concurrency.lockutils [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG oslo_concurrency.lockutils [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG oslo_concurrency.lockutils [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.854 225709 DEBUG nova.compute.manager [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:23:04 compute-1 nova_compute[225705]: 2026-01-23 10:23:04.855 225709 WARNING nova.compute.manager [req-0366537c-0790-4e96-bb56-bd25ff2a5939 req-221a8808-b6d7-41e6-b113-e3dd7a93b1e6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-d372b816-e400-40ea-9e6b-cc8c21e54bc6 for instance with vm_state active and task_state None.
Jan 23 10:23:04 compute-1 sudo[234618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:23:04 compute-1 sudo[234618]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:04 compute-1 sudo[234618]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_36] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef8004b10 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:05 compute-1 nova_compute[225705]: 2026-01-23 10:23:05.686 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:23:05 compute-1 ceph-mon[80126]: pgmap v894: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 8.4 KiB/s wr, 30 op/s
Jan 23 10:23:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:05 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[reaper] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:23:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:06 compute-1 nova_compute[225705]: 2026-01-23 10:23:06.612 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:06 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:06.835 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1596926985' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:07 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:07 compute-1 nova_compute[225705]: 2026-01-23 10:23:07.523 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:07 compute-1 nova_compute[225705]: 2026-01-23 10:23:07.541 225709 DEBUG nova.compute.manager [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-deleted-d372b816-e400-40ea-9e6b-cc8c21e54bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:07 compute-1 nova_compute[225705]: 2026-01-23 10:23:07.541 225709 INFO nova.compute.manager [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Neutron deleted interface d372b816-e400-40ea-9e6b-cc8c21e54bc6; detaching it from the instance and deleting it from the info cache
Jan 23 10:23:07 compute-1 nova_compute[225705]: 2026-01-23 10:23:07.541 225709 DEBUG nova.network.neutron [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:23:07 compute-1 nova_compute[225705]: 2026-01-23 10:23:07.930 225709 DEBUG nova.network.neutron [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:23:08 compute-1 ceph-mon[80126]: pgmap v895: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 8.8 KiB/s wr, 32 op/s
Jan 23 10:23:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2002394447' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.117 225709 DEBUG nova.objects.instance [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'system_metadata' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:23:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:08.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee800c4d0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:08 compute-1 podman[234645]: 2026-01-23 10:23:08.719456618 +0000 UTC m=+0.113703100 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Jan 23 10:23:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:08 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.828 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.829 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.829 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.829 225709 DEBUG nova.network.neutron [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.830 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.831 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.832 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.832 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.832 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:08.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:08 compute-1 nova_compute[225705]: 2026-01-23 10:23:08.869 225709 DEBUG nova.objects.instance [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lazy-loading 'flavor' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:23:09 compute-1 ovn_controller[133293]: 2026-01-23T10:23:09Z|00071|binding|INFO|Releasing lport f2152a4c-f6dc-4a1a-bc3c-2fda8af505ba from this chassis (sb_readonly=0)
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.087 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:09 compute-1 ceph-mon[80126]: pgmap v896: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 3.2 KiB/s wr, 31 op/s
Jan 23 10:23:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/644210232' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.245 225709 DEBUG nova.virt.libvirt.vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.246 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.247 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.251 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.253 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.253 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.254 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.254 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.254 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.282 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <name>instance-00000006</name>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:23:02</nova:creationTime>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:23:09 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <system>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </system>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <os>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </os>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <features>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </features>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target dev='tapdfaa68a5-31'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </target>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </console>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <video>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </video>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]: </domain>
Jan 23 10:23:09 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.284 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.296 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:02:7d:ea"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd372b816-e4"/></interface>not found in domain: <domain type='kvm' id='3'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <name>instance-00000006</name>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <uuid>db7b623e-73d8-45e2-b7eb-1a861bef62c2</uuid>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:23:02</nova:creationTime>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:23:09 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <memory unit='KiB'>131072</memory>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <currentMemory unit='KiB'>131072</currentMemory>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <vcpu placement='static'>1</vcpu>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <resource>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <partition>/machine</partition>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </resource>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <sysinfo type='smbios'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <system>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='manufacturer'>RDO</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='product'>OpenStack Compute</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='serial'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='uuid'>db7b623e-73d8-45e2-b7eb-1a861bef62c2</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <entry name='family'>Virtual Machine</entry>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </system>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <os>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <type arch='x86_64' machine='pc-q35-rhel9.8.0'>hvm</type>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <boot dev='hd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <smbios mode='sysinfo'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </os>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <features>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <vmcoreinfo state='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </features>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <cpu mode='custom' match='exact' check='full'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <model fallback='forbid'>EPYC-Rome</model>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <vendor>AMD</vendor>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='x2apic'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc-deadline'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='hypervisor'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='tsc_adjust'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='spec-ctrl'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='stibp'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ssbd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='cmp_legacy'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='overflow-recov'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='succor'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='ibrs'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='amd-ssbd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='virt-ssbd'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='lbrv'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='tsc-scale'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='vmcb-clean'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='flushbyasid'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pause-filter'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='pfthreshold'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svme-addr-chk'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='lfence-always-serializing'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='xsaves'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='svm'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='require' name='topoext'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='npt'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <feature policy='disable' name='nrip-save'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <clock offset='utc'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <timer name='pit' tickpolicy='delay'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <timer name='rtc' tickpolicy='catchup'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <timer name='hpet' present='no'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <on_poweroff>destroy</on_poweroff>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <on_reboot>restart</on_reboot>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <on_crash>destroy</on_crash>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <emulator>/usr/libexec/qemu-kvm</emulator>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <disk type='network' device='disk'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk' index='2'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target dev='vda' bus='virtio'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='virtio-disk0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <disk type='network' device='cdrom'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <driver name='qemu' type='raw' cache='none'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <auth username='openstack'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <secret type='ceph' uuid='f3005f84-239a-55b6-a948-8f1fb592b920'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source protocol='rbd' name='vms/db7b623e-73d8-45e2-b7eb-1a861bef62c2_disk.config' index='1'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.100' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.102' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <host name='192.168.122.101' port='6789'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </source>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target dev='sda' bus='sata'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <readonly/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='sata0-0-0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='0' model='pcie-root'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pcie.0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='1' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='1' port='0x10'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='2' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='2' port='0x11'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='3' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='3' port='0x12'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='4' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='4' port='0x13'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='5' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='5' port='0x14'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='6' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='6' port='0x15'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='7' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='7' port='0x16'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='8' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='8' port='0x17'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.8'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='9' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='9' port='0x18'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.9'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='10' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='10' port='0x19'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.10'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='11' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='11' port='0x1a'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.11'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='12' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='12' port='0x1b'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.12'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='13' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='13' port='0x1c'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.13'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='14' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='14' port='0x1d'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.14'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='15' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='15' port='0x1e'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.15'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='16' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='16' port='0x1f'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.16'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='17' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='17' port='0x20'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.17'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='18' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='18' port='0x21'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.18'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='19' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='19' port='0x22'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.19'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='20' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='20' port='0x23'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.20'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='21' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='21' port='0x24'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.21'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='22' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='22' port='0x25'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.22'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='23' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='23' port='0x26'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.23'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='24' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='24' port='0x27'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.24'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='25' model='pcie-root-port'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-root-port'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target chassis='25' port='0x28'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.25'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model name='pcie-pci-bridge'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='pci.26'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='usb' index='0' model='piix3-uhci'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='usb'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <controller type='sata' index='0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='ide'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </controller>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <interface type='ethernet'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <mac address='fa:16:3e:b7:90:a0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target dev='tapdfaa68a5-31'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model type='virtio'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <driver name='vhost' rx_queue_size='512'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <mtu size='1442'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='net0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <serial type='pty'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target type='isa-serial' port='0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:         <model name='isa-serial'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       </target>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <console type='pty' tty='/dev/pts/0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <source path='/dev/pts/0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <log file='/var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2/console.log' append='off'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <target type='serial' port='0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='serial0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </console>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <input type='tablet' bus='usb'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='input0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='usb' bus='0' port='1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <input type='mouse' bus='ps2'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='input1'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <input type='keyboard' bus='ps2'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='input2'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </input>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <listen type='address' address='::0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </graphics>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <audio id='1' type='none'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <video>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <model type='virtio' heads='1' primary='yes'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='video0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </video>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <watchdog model='itco' action='reset'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='watchdog0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </watchdog>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <memballoon model='virtio'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <stats period='10'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='balloon0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <rng model='virtio'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <backend model='random'>/dev/urandom</backend>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <alias name='rng0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='selinux' relabel='yes'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <label>system_u:system_r:svirt_t:s0:c125,c442</label>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <imagelabel>system_u:object_r:svirt_image_t:s0:c125,c442</imagelabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <seclabel type='dynamic' model='dac' relabel='yes'>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <label>+107:+107</label>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <imagelabel>+107:+107</imagelabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </seclabel>
Jan 23 10:23:09 compute-1 nova_compute[225705]: </domain>
Jan 23 10:23:09 compute-1 nova_compute[225705]:  get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.296 225709 WARNING nova.virt.libvirt.driver [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Detaching interface fa:16:3e:02:7d:ea failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapd372b816-e4' not found.
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.297 225709 DEBUG nova.virt.libvirt.vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.298 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converting VIF {"id": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "address": "fa:16:3e:02:7d:ea", "network": {"id": "6a09a282-aa22-47cf-a68d-ce0dba493868", "bridge": "br-int", "label": "tempest-network-smoke--1499919951", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.20", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd372b816-e4", "ovs_interfaceid": "d372b816-e400-40ea-9e6b-cc8c21e54bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.299 225709 DEBUG nova.network.os_vif_util [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.299 225709 DEBUG os_vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.302 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.303 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd372b816-e4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.303 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.306 225709 INFO os_vif [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:02:7d:ea,bridge_name='br-int',has_traffic_filtering=True,id=d372b816-e400-40ea-9e6b-cc8c21e54bc6,network=Network(6a09a282-aa22-47cf-a68d-ce0dba493868),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd372b816-e4')
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.307 225709 DEBUG nova.virt.libvirt.guest [req-0ff8a640-9a51-4646-acfa-659a24c00751 req-5caae38c-70cb-4b41-9b87-4611f4098e5a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:name>tempest-TestNetworkBasicOps-server-1937921770</nova:name>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:creationTime>2026-01-23 10:23:09</nova:creationTime>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:flavor name="m1.nano">
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:memory>128</nova:memory>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:disk>1</nova:disk>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:swap>0</nova:swap>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:vcpus>1</nova:vcpus>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:flavor>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:owner>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:owner>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   <nova:ports>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     <nova:port uuid="dfaa68a5-31a2-4de5-996e-11936357ca9b">
Jan 23 10:23:09 compute-1 nova_compute[225705]:       <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Jan 23 10:23:09 compute-1 nova_compute[225705]:     </nova:port>
Jan 23 10:23:09 compute-1 nova_compute[225705]:   </nova:ports>
Jan 23 10:23:09 compute-1 nova_compute[225705]: </nova:instance>
Jan 23 10:23:09 compute-1 nova_compute[225705]:  set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359
Jan 23 10:23:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:09 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:23:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1488791737' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:09 compute-1 nova_compute[225705]: 2026-01-23 10:23:09.760 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.505s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:23:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1432565687' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1488791737' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:10.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.662 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.662 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-00000006 as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:23:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_31] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:10 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:10.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.928 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.929 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4732MB free_disk=59.942562103271484GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.930 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.930 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.993 225709 DEBUG nova.compute.manager [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.994 225709 DEBUG nova.compute.manager [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing instance network info cache due to event network-changed-dfaa68a5-31a2-4de5-996e-11936357ca9b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:23:10 compute-1 nova_compute[225705]: 2026-01-23 10:23:10.994 225709 DEBUG oslo_concurrency.lockutils [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.034 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance db7b623e-73d8-45e2-b7eb-1a861bef62c2 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.035 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.035 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.094 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.095 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.096 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.096 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.097 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.099 225709 INFO nova.compute.manager [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Terminating instance
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.101 225709 DEBUG nova.compute.manager [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.138 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:23:11 compute-1 kernel: tapdfaa68a5-31 (unregistering): left promiscuous mode
Jan 23 10:23:11 compute-1 NetworkManager[48978]: <info>  [1769163791.1508] device (tapdfaa68a5-31): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:23:11 compute-1 ovn_controller[133293]: 2026-01-23T10:23:11Z|00072|binding|INFO|Releasing lport dfaa68a5-31a2-4de5-996e-11936357ca9b from this chassis (sb_readonly=0)
Jan 23 10:23:11 compute-1 ovn_controller[133293]: 2026-01-23T10:23:11Z|00073|binding|INFO|Setting lport dfaa68a5-31a2-4de5-996e-11936357ca9b down in Southbound
Jan 23 10:23:11 compute-1 ovn_controller[133293]: 2026-01-23T10:23:11Z|00074|binding|INFO|Removing iface tapdfaa68a5-31 ovn-installed in OVS
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.163 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.178 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b7:90:a0 10.100.0.11'], port_security=['fa:16:3e:b7:90:a0 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'db7b623e-73d8-45e2-b7eb-1a861bef62c2', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8afa87f8-5b22-4350-8bf2-c7af019c3372', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9dbc1781-4648-4570-b3c6-0353674ab246, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=dfaa68a5-31a2-4de5-996e-11936357ca9b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.179 143098 INFO neutron.agent.ovn.metadata.agent [-] Port dfaa68a5-31a2-4de5-996e-11936357ca9b in datapath eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 unbound from our chassis
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.180 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.181 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8e7cc74c-15e2-43b2-87ed-8dd2db394424]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.182 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 namespace which is not needed anymore
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.194 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Jan 23 10:23:11 compute-1 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 17.604s CPU time.
Jan 23 10:23:11 compute-1 systemd-machined[194551]: Machine qemu-3-instance-00000006 terminated.
Jan 23 10:23:11 compute-1 ceph-mon[80126]: pgmap v897: 353 pgs: 353 active+clean; 121 MiB data, 307 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.2 KiB/s wr, 29 op/s
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.326 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : haproxy version is 2.8.14-c23fe91
Jan 23 10:23:11 compute-1 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [NOTICE]   (234054) : path to executable is /usr/sbin/haproxy
Jan 23 10:23:11 compute-1 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [ALERT]    (234054) : Current worker (234056) exited with code 143 (Terminated)
Jan 23 10:23:11 compute-1 neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2[234048]: [WARNING]  (234054) : All workers exited. Exiting... (0)
Jan 23 10:23:11 compute-1 systemd[1]: libpod-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee.scope: Deactivated successfully.
Jan 23 10:23:11 compute-1 podman[234723]: 2026-01-23 10:23:11.334991779 +0000 UTC m=+0.058947496 container died 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.335 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.343 225709 INFO nova.virt.libvirt.driver [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Instance destroyed successfully.
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.344 225709 DEBUG nova.objects.instance [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid db7b623e-73d8-45e2-b7eb-1a861bef62c2 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:23:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:11 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:11 compute-1 systemd[1]: var-lib-containers-storage-overlay-33d908cc39ee3158e23a819479dadd5bc8e191abccd206682925f5884fa34301-merged.mount: Deactivated successfully.
Jan 23 10:23:11 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee-userdata-shm.mount: Deactivated successfully.
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.379 225709 DEBUG nova.virt.libvirt.vif [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:21:30Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1937921770',display_name='tempest-TestNetworkBasicOps-server-1937921770',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1937921770',id=6,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLCeGctEXJFE/zWu++kuUPZPEUGdZ1tAUfr59UaFIa+7QXB9PaN66ZnnrB5CAzkkoYqyfCxGwKiH7tIv3Z3Id2K8u9ymKR9c9j/fpM9j4VjkMST29u+6K6TF+84g8VYZ2w==',key_name='tempest-TestNetworkBasicOps-1095646374',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:21:44Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-v7ncu2ru',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:21:44Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=db7b623e-73d8-45e2-b7eb-1a861bef62c2,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.379 225709 DEBUG nova.network.os_vif_util [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.180", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.380 225709 DEBUG nova.network.os_vif_util [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.380 225709 DEBUG os_vif [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.381 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.382 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdfaa68a5-31, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:23:11 compute-1 podman[234723]: 2026-01-23 10:23:11.389797444 +0000 UTC m=+0.113753131 container cleanup 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.389 225709 INFO os_vif [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b7:90:a0,bridge_name='br-int',has_traffic_filtering=True,id=dfaa68a5-31a2-4de5-996e-11936357ca9b,network=Network(eae9e618-a7c2-43e9-ab46-9070ca2ef7f2),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapdfaa68a5-31')
Jan 23 10:23:11 compute-1 systemd[1]: libpod-conmon-77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee.scope: Deactivated successfully.
Jan 23 10:23:11 compute-1 podman[234791]: 2026-01-23 10:23:11.465217792 +0000 UTC m=+0.051284397 container remove 77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.474 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c04b5cc9-7b4d-4e29-95aa-cc6d2488671a]: (4, ('Fri Jan 23 10:23:11 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 (77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee)\n77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee\nFri Jan 23 10:23:11 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 (77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee)\n77d77bb026c32d51e75f6620042e19893329adeba99c41ecaeed5ef6a083c7ee\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.476 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6bb4ec96-72c9-4ba9-aa4a-1197302c2002]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.477 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeae9e618-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.479 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 kernel: tapeae9e618-a0: left promiscuous mode
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.499 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[cb6bc248-e06a-429f-90d4-5cb6fe024cf5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.517 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7b603179-c186-42b2-994e-4dfab042785f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.521 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[66627b4e-35f5-46d6-8920-28644d9053a6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.540 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4e4ae76a-8b97-4534-958a-dfa9fe36b0e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 485745, 'reachable_time': 31343, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 234819, 'error': None, 'target': 'ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.544 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eae9e618-a7c2-43e9-ab46-9070ca2ef7f2 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:23:11 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:11.544 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[45687743-82da-41ba-86e5-940de42c2c29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:23:11 compute-1 systemd[1]: run-netns-ovnmeta\x2deae9e618\x2da7c2\x2d43e9\x2dab46\x2d9070ca2ef7f2.mount: Deactivated successfully.
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.613 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:23:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/721921800' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.644 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.652 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.688 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.689 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:23:11 compute-1 nova_compute[225705]: 2026-01-23 10:23:11.689 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.759s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.069 225709 DEBUG nova.network.neutron [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.085 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.087 225709 DEBUG oslo_concurrency.lockutils [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.087 225709 DEBUG nova.network.neutron [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Refreshing network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.111 225709 DEBUG oslo_concurrency.lockutils [None req-fed61897-7eed-45ad-b507-5350d7ba9e91 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "interface-db7b623e-73d8-45e2-b7eb-1a861bef62c2-d372b816-e400-40ea-9e6b-cc8c21e54bc6" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 9.800s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.286 225709 INFO nova.virt.libvirt.driver [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deleting instance files /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2_del
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.287 225709 INFO nova.virt.libvirt.driver [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deletion of /var/lib/nova/instances/db7b623e-73d8-45e2-b7eb-1a861bef62c2_del complete
Jan 23 10:23:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/721921800' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.331 225709 INFO nova.compute.manager [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 1.23 seconds to destroy the instance on the hypervisor.
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.332 225709 DEBUG oslo.service.loopingcall [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.332 225709 DEBUG nova.compute.manager [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.332 225709 DEBUG nova.network.neutron [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:23:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:12.368 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102312 (4) : Server backend/nfs.cephfs.1 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:23:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.731 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:12 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.751 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.752 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:23:12 compute-1 nova_compute[225705]: 2026-01-23 10:23:12.752 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:23:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:12.845 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.148 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-unplugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.149 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.149 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.149 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.150 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-unplugged-dfaa68a5-31a2-4de5-996e-11936357ca9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.150 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-unplugged-dfaa68a5-31a2-4de5-996e-11936357ca9b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.150 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.151 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.151 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.151 225709 DEBUG oslo_concurrency.lockutils [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.152 225709 DEBUG nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] No waiting events found dispatching network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.152 225709 WARNING nova.compute.manager [req-1e6936a1-36bf-4d4f-827a-b59ab9f098f2 req-729b4fa9-96be-483b-99fc-2fa380a2776b 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received unexpected event network-vif-plugged-dfaa68a5-31a2-4de5-996e-11936357ca9b for instance with vm_state active and task_state deleting.
Jan 23 10:23:13 compute-1 ceph-mon[80126]: pgmap v898: 353 pgs: 353 active+clean; 80 MiB data, 282 MiB used, 60 GiB / 60 GiB avail; 23 KiB/s rd, 5.2 KiB/s wr, 35 op/s
Jan 23 10:23:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:13 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.897 225709 DEBUG nova.network.neutron [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.924 225709 INFO nova.compute.manager [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Took 1.59 seconds to deallocate network for instance.
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.982 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.982 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:13 compute-1 nova_compute[225705]: 2026-01-23 10:23:13.996 225709 DEBUG nova.compute.manager [req-c026a045-b632-4c6b-8d6c-ab509f3a3f72 req-299df4e3-51e4-47f1-98a4-29f28d2d21ee 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Received event network-vif-deleted-dfaa68a5-31a2-4de5-996e-11936357ca9b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.036 225709 DEBUG oslo_concurrency.processutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.172 225709 DEBUG nova.network.neutron [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updated VIF entry in instance network info cache for port dfaa68a5-31a2-4de5-996e-11936357ca9b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.173 225709 DEBUG nova.network.neutron [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Updating instance_info_cache with network_info: [{"id": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "address": "fa:16:3e:b7:90:a0", "network": {"id": "eae9e618-a7c2-43e9-ab46-9070ca2ef7f2", "bridge": "br-int", "label": "tempest-network-smoke--1113553220", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapdfaa68a5-31", "ovs_interfaceid": "dfaa68a5-31a2-4de5-996e-11936357ca9b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.196 225709 DEBUG oslo_concurrency.lockutils [req-3b15074b-73d9-478a-92f3-5cf141c16af3 req-0cd55ae7-e255-4ec7-9bbe-834d52db6e1f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-db7b623e-73d8-45e2-b7eb-1a861bef62c2" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:23:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:14.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:23:14 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1573239966' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.548 225709 DEBUG oslo_concurrency.processutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.512s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.556 225709 DEBUG nova.compute.provider_tree [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.579 225709 DEBUG nova.scheduler.client.report [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.602 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.628 225709 INFO nova.scheduler.client.report [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance db7b623e-73d8-45e2-b7eb-1a861bef62c2
Jan 23 10:23:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:14 compute-1 nova_compute[225705]: 2026-01-23 10:23:14.703 225709 DEBUG oslo_concurrency.lockutils [None req-c5371617-aa32-4cf3-b3d5-71b7bb6705d1 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "db7b623e-73d8-45e2-b7eb-1a861bef62c2" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.608s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1573239966' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:14 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:14.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:15 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:15 compute-1 ceph-mon[80126]: pgmap v899: 353 pgs: 353 active+clean; 80 MiB data, 282 MiB used, 60 GiB / 60 GiB avail; 4.0 KiB/s rd, 2.4 KiB/s wr, 7 op/s
Jan 23 10:23:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:16.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:16 compute-1 nova_compute[225705]: 2026-01-23 10:23:16.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:16 compute-1 nova_compute[225705]: 2026-01-23 10:23:16.614 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:16 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:16.851 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:17 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:17 compute-1 ceph-mon[80126]: pgmap v900: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 3.2 KiB/s wr, 30 op/s
Jan 23 10:23:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:18.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:18 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:18.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:19 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:19 compute-1 ceph-mon[80126]: pgmap v901: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.8 KiB/s wr, 28 op/s
Jan 23 10:23:19 compute-1 nova_compute[225705]: 2026-01-23 10:23:19.700 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:19 compute-1 nova_compute[225705]: 2026-01-23 10:23:19.804 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:20.380 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:20 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:20.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:21 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:21 compute-1 nova_compute[225705]: 2026-01-23 10:23:21.389 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:21 compute-1 ceph-mon[80126]: pgmap v902: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.8 KiB/s wr, 28 op/s
Jan 23 10:23:21 compute-1 nova_compute[225705]: 2026-01-23 10:23:21.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:22.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:22 compute-1 podman[234851]: 2026-01-23 10:23:22.640632205 +0000 UTC m=+0.048755578 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Jan 23 10:23:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:22 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:22.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:23 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:23 compute-1 ceph-mon[80126]: pgmap v903: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 2.8 KiB/s wr, 28 op/s
Jan 23 10:23:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:24.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:24 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:24 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_35] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef0001090 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:24.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:25 compute-1 sudo[234872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:23:25 compute-1 sudo[234872]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:25 compute-1 sudo[234872]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:25 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:25 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faee0002690 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:25 compute-1 ceph-mon[80126]: pgmap v904: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 852 B/s wr, 22 op/s
Jan 23 10:23:26 compute-1 nova_compute[225705]: 2026-01-23 10:23:26.341 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163791.3396697, db7b623e-73d8-45e2-b7eb-1a861bef62c2 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:23:26 compute-1 nova_compute[225705]: 2026-01-23 10:23:26.341 225709 INFO nova.compute.manager [-] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] VM Stopped (Lifecycle Event)
Jan 23 10:23:26 compute-1 nova_compute[225705]: 2026-01-23 10:23:26.369 225709 DEBUG nova.compute.manager [None req-21a88963-318e-48c9-95f1-56c738fa502d - - - - - -] [instance: db7b623e-73d8-45e2-b7eb-1a861bef62c2] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:23:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:26.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:26 compute-1 nova_compute[225705]: 2026-01-23 10:23:26.394 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:26 compute-1 nova_compute[225705]: 2026-01-23 10:23:26.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_32] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faf14003ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:26 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:26 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_34] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:26.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:27 compute-1 kernel: ganesha.nfsd[234383]: segfault at 50 ip 00007faf9a38932e sp 00007faf037fd210 error 4 in libntirpc.so.5.8[7faf9a36e000+2c000] likely on CPU 2 (core 0, socket 2)
Jan 23 10:23:27 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:23:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[228646]: 23/01/2026 10:23:27 : epoch 69734a69 : compute-1 : ganesha.nfsd-2[svc_37] rpc :TIRPC :EVENT :svc_vc_recv: 0x7faef4001ba0 fd 42 proxy ignored for local
Jan 23 10:23:27 compute-1 systemd[1]: Started Process Core Dump (PID 234898/UID 0).
Jan 23 10:23:27 compute-1 ceph-mon[80126]: pgmap v905: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 853 B/s wr, 22 op/s
Jan 23 10:23:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:28.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:28 compute-1 systemd-coredump[234899]: Process 228650 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 79:
                                                    #0  0x00007faf9a38932e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:23:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:28 compute-1 systemd[1]: systemd-coredump@10-234898-0.service: Deactivated successfully.
Jan 23 10:23:28 compute-1 systemd[1]: systemd-coredump@10-234898-0.service: Consumed 1.217s CPU time.
Jan 23 10:23:28 compute-1 podman[234905]: 2026-01-23 10:23:28.756768728 +0000 UTC m=+0.031872833 container died 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:23:28 compute-1 systemd[1]: var-lib-containers-storage-overlay-bed27c64872bf8529345969a4191e92df6ccf355f83eb481cb6d4f93143e1094-merged.mount: Deactivated successfully.
Jan 23 10:23:28 compute-1 podman[234905]: 2026-01-23 10:23:28.80085709 +0000 UTC m=+0.075961125 container remove 0b872f0e8bc76b7d6eee9eacb5ff8971176a2ec887fbe703344a6669530bce26 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:23:28 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:23:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:28.872 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:29 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:23:29 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.409s CPU time.
Jan 23 10:23:29 compute-1 ceph-mon[80126]: pgmap v906: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:23:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:30.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:30.517 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:23:30 compute-1 nova_compute[225705]: 2026-01-23 10:23:30.518 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:30.518 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:23:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:30.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:31 compute-1 nova_compute[225705]: 2026-01-23 10:23:31.397 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:31 compute-1 nova_compute[225705]: 2026-01-23 10:23:31.623 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:32 compute-1 ceph-mon[80126]: pgmap v907: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:23:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:32.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:32.521 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:23:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:32.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:33 compute-1 ceph-mon[80126]: pgmap v908: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:23:33 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102333 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:23:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:34.405 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:34.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:35 compute-1 ceph-mon[80126]: pgmap v909: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 255 B/s rd, 0 op/s
Jan 23 10:23:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:36 compute-1 nova_compute[225705]: 2026-01-23 10:23:36.400 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:36.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:36 compute-1 nova_compute[225705]: 2026-01-23 10:23:36.625 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:36.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:38 compute-1 ceph-mon[80126]: pgmap v910: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 341 B/s rd, 0 op/s
Jan 23 10:23:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:38.411 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:38.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:39 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 11.
Jan 23 10:23:39 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:23:39 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 2.409s CPU time.
Jan 23 10:23:39 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:23:39 compute-1 ceph-mon[80126]: pgmap v911: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:23:39 compute-1 podman[234954]: 2026-01-23 10:23:39.228403093 +0000 UTC m=+0.125624192 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Jan 23 10:23:39 compute-1 podman[235025]: 2026-01-23 10:23:39.383767569 +0000 UTC m=+0.072312231 container create 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0)
Jan 23 10:23:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:23:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:23:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:23:39 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:23:39 compute-1 podman[235025]: 2026-01-23 10:23:39.356217842 +0000 UTC m=+0.044762554 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:23:39 compute-1 podman[235025]: 2026-01-23 10:23:39.454181492 +0000 UTC m=+0.142726224 container init 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:23:39 compute-1 podman[235025]: 2026-01-23 10:23:39.459878289 +0000 UTC m=+0.148422931 container start 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.40.1, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True)
Jan 23 10:23:39 compute-1 bash[235025]: 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a
Jan 23 10:23:39 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:23:39 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:39 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:23:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:40.414 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:40.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:41 compute-1 nova_compute[225705]: 2026-01-23 10:23:41.405 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:41 compute-1 nova_compute[225705]: 2026-01-23 10:23:41.626 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:41 compute-1 ceph-mon[80126]: pgmap v912: 353 pgs: 353 active+clean; 41 MiB data, 262 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.869178) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821869209, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 1625, "num_deletes": 505, "total_data_size": 3294967, "memory_usage": 3352472, "flush_reason": "Manual Compaction"}
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821880260, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 1383209, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28237, "largest_seqno": 29857, "table_properties": {"data_size": 1377972, "index_size": 2057, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 16001, "raw_average_key_size": 19, "raw_value_size": 1364717, "raw_average_value_size": 1630, "num_data_blocks": 90, "num_entries": 837, "num_filter_entries": 837, "num_deletions": 505, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163706, "oldest_key_time": 1769163706, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 11121 microseconds, and 4914 cpu microseconds.
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.880298) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 1383209 bytes OK
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.880316) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882226) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882239) EVENT_LOG_v1 {"time_micros": 1769163821882235, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882257) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3286579, prev total WAL file size 3286579, number of live WAL files 2.
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.883095) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D00353130' seq:72057594037927935, type:22 .. '6C6F676D00373631' seq:0, type:0; will stop at (end)
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(1350KB)], [54(14MB)]
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821883171, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 16235616, "oldest_snapshot_seqno": -1}
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 5784 keys, 12761688 bytes, temperature: kUnknown
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821994316, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 12761688, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12724317, "index_size": 21837, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14469, "raw_key_size": 149288, "raw_average_key_size": 25, "raw_value_size": 12620745, "raw_average_value_size": 2182, "num_data_blocks": 877, "num_entries": 5784, "num_filter_entries": 5784, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163821, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.994751) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 12761688 bytes
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.997001) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 145.9 rd, 114.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.3, 14.2 +0.0 blob) out(12.2 +0.0 blob), read-write-amplify(21.0) write-amplify(9.2) OK, records in: 6762, records dropped: 978 output_compression: NoCompression
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.997034) EVENT_LOG_v1 {"time_micros": 1769163821997019, "job": 32, "event": "compaction_finished", "compaction_time_micros": 111249, "compaction_time_cpu_micros": 53481, "output_level": 6, "num_output_files": 1, "total_output_size": 12761688, "num_input_records": 6762, "num_output_records": 5784, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:23:41 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163821997701, "job": 32, "event": "table_file_deletion", "file_number": 56}
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163822002614, "job": 32, "event": "table_file_deletion", "file_number": 54}
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:41.882992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002706) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002713) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002720) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:42 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:23:42.002723) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:23:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:42.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2132895342' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:42.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:43 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 10:23:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:43 compute-1 ceph-mon[80126]: pgmap v913: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:23:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:44.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:44.898 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:45 compute-1 sudo[235088]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:23:45 compute-1 sudo[235088]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:45 compute-1 sudo[235088]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:45 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:23:45 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:45 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:23:45 compute-1 ceph-mon[80126]: pgmap v914: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 597 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:23:45 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3313488858' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:23:45 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1984605226' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:23:46 compute-1 nova_compute[225705]: 2026-01-23 10:23:46.408 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:46.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:46 compute-1 nova_compute[225705]: 2026-01-23 10:23:46.629 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:46.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:47 compute-1 ceph-mon[80126]: pgmap v915: 353 pgs: 353 active+clean; 68 MiB data, 272 MiB used, 60 GiB / 60 GiB avail; 3.1 KiB/s rd, 1.1 MiB/s wr, 6 op/s
Jan 23 10:23:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:48.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:48.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:23:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1571219015' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:23:48 compute-1 ceph-mon[80126]: pgmap v916: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:23:49 compute-1 sudo[235116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:23:49 compute-1 sudo[235116]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:49 compute-1 sudo[235116]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:49 compute-1 sudo[235141]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:23:49 compute-1 sudo[235141]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:23:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:50.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:50 compute-1 sudo[235141]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:50.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:23:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:23:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:51 compute-1 ceph-mon[80126]: pgmap v917: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:23:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:23:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:23:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:23:51 compute-1 nova_compute[225705]: 2026-01-23 10:23:51.413 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_lift_grace_locked :STATE :EVENT :NFS Server Now NOT IN GRACE
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] main :NFS STARTUP :WARN :No export entries found in configuration file !!!
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:24): Unknown block (RADOS_URLS)
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] config_errs_to_log :CONFIG :WARN :Config File (/etc/ganesha/ganesha.conf:29): Unknown block (RGW)
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :CAP_SYS_RESOURCE was successfully removed for proper quota management in FSAL
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] lower_my_caps :NFS STARTUP :EVENT :currently set capabilities are: cap_chown,cap_dac_override,cap_fowner,cap_fsetid,cap_kill,cap_setgid,cap_setuid,cap_setpcap,cap_net_bind_service,cap_sys_chroot,cap_setfcap=ep
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_pkginit :DBUS :CRIT :dbus_bus_get failed (Failed to connect to socket /run/dbus/system_bus_socket: No such file or directory)
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:51 compute-1 nova_compute[225705]: 2026-01-23 10:23:51.631 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Init_svc :DISP :CRIT :Cannot acquire credentials for principal nfs
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gsh_dbus_register_path :DBUS :CRIT :dbus_connection_register_object_path called with no DBUS connection
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Init_admin_thread :NFS CB :EVENT :Admin thread initialized
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :EVENT :Callback creds directory (/var/run/ganesha) already exists
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] find_keytab_entry :NFS CB :WARN :Configuration file does not specify default realm while getting default realm name
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] gssd_refresh_krb5_machine_credential :NFS CB :CRIT :ERROR: gssd_refresh_krb5_machine_credential: no usable keytab entry found in keytab /etc/krb5.keytab for connection with host localhost
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_rpc_cb_init_ccache :NFS STARTUP :WARN :gssd_refresh_krb5_machine_credential failed (-1765328160:0)
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :Starting delayed executor.
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :gsh_dbusthread was started successfully
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :CRIT :DBUS not initialized, service thread exiting
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[dbus] gsh_dbus_thread :DBUS :EVENT :shutdown
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :admin thread was started successfully
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :reaper thread was started successfully
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_Start_threads :THREAD :EVENT :General fridge was started successfully
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :             NFS SERVER INITIALIZED
Jan 23 10:23:51 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:51 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[main] nfs_start :NFS STARTUP :EVENT :-------------------------------------------------
Jan 23 10:23:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:52.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:52 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9d4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:52 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:52.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:53 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:53 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:53 compute-1 podman[235215]: 2026-01-23 10:23:53.683998446 +0000 UTC m=+0.082676634 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:23:53 compute-1 ceph-mon[80126]: pgmap v918: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 104 op/s
Jan 23 10:23:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:54.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:54 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0000b60 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:54 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:54 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8000fa0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:23:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:54.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:23:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:55.053 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:23:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:55.054 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:23:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:23:55.054 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:23:55 compute-1 sudo[235234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:23:55 compute-1 sudo[235234]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:23:55 compute-1 sudo[235234]: pam_unix(sudo:session): session closed for user root
Jan 23 10:23:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102355 (4) : Server backend/nfs.cephfs.0 is UP, reason: Layer4 check passed, check duration: 0ms. 3 active and 0 backup servers online. 0 sessions requeued, 0 total in queue.
Jan 23 10:23:55 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:55 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:55 compute-1 ovn_controller[133293]: 2026-01-23T10:23:55Z|00075|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Jan 23 10:23:55 compute-1 ceph-mon[80126]: pgmap v919: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 103 op/s
Jan 23 10:23:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:23:56 compute-1 nova_compute[225705]: 2026-01-23 10:23:56.418 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:56.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:56 compute-1 nova_compute[225705]: 2026-01-23 10:23:56.677 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:23:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:56 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:56 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:56 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:56 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2411302437' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:23:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:56.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:57 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8001ac0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:57 compute-1 ceph-mon[80126]: pgmap v920: 353 pgs: 353 active+clean; 61 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 109 op/s
Jan 23 10:23:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:23:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:23:58.442 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:23:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:23:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:58 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:58 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:58 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc0016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:23:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:23:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:23:58.925 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:23:59 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:23:59 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:23:59 compute-1 ceph-mon[80126]: pgmap v921: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 737 KiB/s wr, 123 op/s
Jan 23 10:24:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 10:24:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:00.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 10:24:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:00 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:00 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:00 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:00.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:01 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:01 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:01 compute-1 nova_compute[225705]: 2026-01-23 10:24:01.422 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:01 compute-1 nova_compute[225705]: 2026-01-23 10:24:01.679 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:01 compute-1 ceph-mon[80126]: pgmap v922: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.5 KiB/s wr, 92 op/s
Jan 23 10:24:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:02.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:02 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:02 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b00016a0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:02.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:03 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:03 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:03 compute-1 ceph-mon[80126]: pgmap v923: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.5 KiB/s wr, 93 op/s
Jan 23 10:24:03 compute-1 nova_compute[225705]: 2026-01-23 10:24:03.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:03 compute-1 nova_compute[225705]: 2026-01-23 10:24:03.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:24:03 compute-1 nova_compute[225705]: 2026-01-23 10:24:03.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:24:03 compute-1 nova_compute[225705]: 2026-01-23 10:24:03.894 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:24:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:04.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:04 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:04 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:04 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:04.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:05 compute-1 sudo[235264]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:24:05 compute-1 sudo[235264]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:05 compute-1 sudo[235264]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:05 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:05 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:05 compute-1 nova_compute[225705]: 2026-01-23 10:24:05.887 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:05 compute-1 ceph-mon[80126]: pgmap v924: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:06 compute-1 nova_compute[225705]: 2026-01-23 10:24:06.426 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:06.457 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:06 compute-1 nova_compute[225705]: 2026-01-23 10:24:06.680 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:06 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:06 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:06 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:06 compute-1 nova_compute[225705]: 2026-01-23 10:24:06.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:06.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3596100017' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:07 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b80023e0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:07 compute-1 nova_compute[225705]: 2026-01-23 10:24:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:07 compute-1 nova_compute[225705]: 2026-01-23 10:24:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.108 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.109 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:08 compute-1 ceph-mon[80126]: pgmap v925: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1334627562' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1749144063' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:08.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:24:08 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1478679152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.615 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:08 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0002f00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:08 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:08 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc002720 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.826 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.827 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4908MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.828 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.828 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.930 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.931 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:24:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:08.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.949 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.985 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.985 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:24:08 compute-1 nova_compute[225705]: 2026-01-23 10:24:08.999 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:24:09 compute-1 nova_compute[225705]: 2026-01-23 10:24:09.029 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:24:09 compute-1 nova_compute[225705]: 2026-01-23 10:24:09.050 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1478679152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:09 compute-1 ceph-mon[80126]: pgmap v926: 353 pgs: 353 active+clean; 69 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 1.4 MiB/s wr, 34 op/s
Jan 23 10:24:09 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:09 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:24:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3295464724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:09 compute-1 nova_compute[225705]: 2026-01-23 10:24:09.540 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:09 compute-1 nova_compute[225705]: 2026-01-23 10:24:09.548 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:24:09 compute-1 nova_compute[225705]: 2026-01-23 10:24:09.563 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:24:09 compute-1 nova_compute[225705]: 2026-01-23 10:24:09.602 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:24:09 compute-1 nova_compute[225705]: 2026-01-23 10:24:09.602 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.774s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:09 compute-1 podman[235337]: 2026-01-23 10:24:09.698075723 +0000 UTC m=+0.100723456 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:24:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3295464724' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:10.463 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:10 compute-1 nova_compute[225705]: 2026-01-23 10:24:10.602 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:10 compute-1 nova_compute[225705]: 2026-01-23 10:24:10.603 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:10 compute-1 nova_compute[225705]: 2026-01-23 10:24:10.605 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:10 compute-1 nova_compute[225705]: 2026-01-23 10:24:10.605 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:24:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:10 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:10 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:10 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:10.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2577585059' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-1 ceph-mon[80126]: pgmap v927: 353 pgs: 353 active+clean; 69 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 10 KiB/s rd, 1.4 MiB/s wr, 14 op/s
Jan 23 10:24:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/287250540' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/308211205' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:24:11 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:11 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:11 compute-1 nova_compute[225705]: 2026-01-23 10:24:11.429 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:11 compute-1 nova_compute[225705]: 2026-01-23 10:24:11.682 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/469737160' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:24:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:12.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:12 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:12 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:12 compute-1 nova_compute[225705]: 2026-01-23 10:24:12.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:24:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:12.950 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:13 compute-1 ceph-mon[80126]: pgmap v928: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Jan 23 10:24:13 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:13 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:14.468 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:14 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9bc003820 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:14 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:14 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:14.953 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:15 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:15 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:15 compute-1 ceph-mon[80126]: pgmap v929: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 29 op/s
Jan 23 10:24:16 compute-1 nova_compute[225705]: 2026-01-23 10:24:16.433 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:16.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:16 compute-1 nova_compute[225705]: 2026-01-23 10:24:16.684 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102416 (4) : Server backend/nfs.cephfs.1 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 2 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:24:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:16 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_6] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:16 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:16 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b0003c10 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:16.956 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:17 compute-1 ceph-mon[80126]: pgmap v930: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 265 KiB/s rd, 1.8 MiB/s wr, 42 op/s
Jan 23 10:24:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:17 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_10] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc001c00 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:24:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:18.474 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:24:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:18 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:18 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:18 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:18.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:19 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:19 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:19 compute-1 ceph-mon[80126]: pgmap v931: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:24:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:20.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:20 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:20 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:20 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc003cc0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:24:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:20.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:24:21 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:21 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_4] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9d4000df0 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:21 compute-1 nova_compute[225705]: 2026-01-23 10:24:21.437 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:21 compute-1 nova_compute[225705]: 2026-01-23 10:24:21.686 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:21 compute-1 ceph-mon[80126]: pgmap v932: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 359 KiB/s wr, 86 op/s
Jan 23 10:24:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:22.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:22 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_2] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:22 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9b8003730 fd 38 proxy header rest len failed header rlen = % (will set dead)
Jan 23 10:24:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:22.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:23 compute-1 kernel: ganesha.nfsd[235200]: segfault at 50 ip 00007fba5ddef32e sp 00007fb9f1ffa210 error 4 in libntirpc.so.5.8[7fba5ddd4000+2c000] likely on CPU 7 (core 0, socket 7)
Jan 23 10:24:23 compute-1 kernel: Code: 47 20 66 41 89 86 f2 00 00 00 41 bf 01 00 00 00 b9 40 00 00 00 e9 af fd ff ff 66 90 48 8b 85 f8 00 00 00 48 8b 40 08 4c 8b 28 <45> 8b 65 50 49 8b 75 68 41 8b be 28 02 00 00 b9 40 00 00 00 e8 29
Jan 23 10:24:23 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235041]: 23/01/2026 10:24:23 : epoch 69734c2b : compute-1 : ganesha.nfsd-2[svc_9] rpc :TIRPC :EVENT :svc_vc_recv: 0x7fb9cc003cc0 fd 38 proxy ignored for local
Jan 23 10:24:23 compute-1 systemd[1]: Started Process Core Dump (PID 235370/UID 0).
Jan 23 10:24:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:24 compute-1 ceph-mon[80126]: pgmap v933: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 360 KiB/s wr, 113 op/s
Jan 23 10:24:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2724804691' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:24.484 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:24 compute-1 systemd-coredump[235372]: Process 235046 (ganesha.nfsd) of user 0 dumped core.
                                                    
                                                    Stack trace of thread 41:
                                                    #0  0x00007fba5ddef32e n/a (/usr/lib64/libntirpc.so.5.8 + 0x2232e)
                                                    ELF object binary architecture: AMD x86-64
Jan 23 10:24:24 compute-1 podman[235373]: 2026-01-23 10:24:24.652199517 +0000 UTC m=+0.052465384 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent)
Jan 23 10:24:24 compute-1 systemd[1]: systemd-coredump@11-235370-0.service: Deactivated successfully.
Jan 23 10:24:24 compute-1 systemd[1]: systemd-coredump@11-235370-0.service: Consumed 1.131s CPU time.
Jan 23 10:24:24 compute-1 podman[235396]: 2026-01-23 10:24:24.74416399 +0000 UTC m=+0.020296313 container died 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, io.buildah.version=1.40.1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default)
Jan 23 10:24:24 compute-1 systemd[1]: var-lib-containers-storage-overlay-c5e98b19e5f968ae08d4a49d3f82b587ac60eea558f705069f14cfd8b5fe5373-merged.mount: Deactivated successfully.
Jan 23 10:24:24 compute-1 podman[235396]: 2026-01-23 10:24:24.792941119 +0000 UTC m=+0.069073442 container remove 49658bc31e9fbe3f00099ea6fef75632fbb58ac97d76e9fb95bf060c9652206a (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9)
Jan 23 10:24:24 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Main process exited, code=exited, status=139/n/a
Jan 23 10:24:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:24.970 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:24 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Failed with result 'exit-code'.
Jan 23 10:24:24 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.442s CPU time.
Jan 23 10:24:25 compute-1 sudo[235440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:24:25 compute-1 sudo[235440]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:25 compute-1 sudo[235440]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:25 compute-1 ceph-mon[80126]: pgmap v934: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.3 KiB/s wr, 98 op/s
Jan 23 10:24:26 compute-1 nova_compute[225705]: 2026-01-23 10:24:26.444 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:26.487 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:26 compute-1 nova_compute[225705]: 2026-01-23 10:24:26.687 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:24:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:26.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:24:27 compute-1 ceph-mon[80126]: pgmap v935: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.3 KiB/s wr, 98 op/s
Jan 23 10:24:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:28.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:24:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:28.978 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:24:29 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102429 (4) : Server backend/nfs.cephfs.0 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 1 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:24:29 compute-1 ceph-mon[80126]: pgmap v936: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.7 MiB/s rd, 1.2 KiB/s wr, 84 op/s
Jan 23 10:24:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:30.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:24:30.923 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:24:30 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:24:30.923 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:24:30 compute-1 nova_compute[225705]: 2026-01-23 10:24:30.924 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:30.983 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:31 compute-1 nova_compute[225705]: 2026-01-23 10:24:31.449 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:31 compute-1 nova_compute[225705]: 2026-01-23 10:24:31.689 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:32 compute-1 ceph-mon[80126]: pgmap v937: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:32.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:32.985 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:33 compute-1 ceph-mon[80126]: pgmap v938: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 26 op/s
Jan 23 10:24:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:34.499 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:34.989 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:35 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Scheduled restart job, restart counter is at 12.
Jan 23 10:24:35 compute-1 systemd[1]: Stopped Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:24:35 compute-1 systemd[1]: ceph-f3005f84-239a-55b6-a948-8f1fb592b920@nfs.cephfs.0.0.compute-1.bawllm.service: Consumed 1.442s CPU time.
Jan 23 10:24:35 compute-1 systemd[1]: Starting Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920...
Jan 23 10:24:35 compute-1 podman[235518]: 2026-01-23 10:24:35.424372347 +0000 UTC m=+0.050381889 container create 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325)
Jan 23 10:24:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/etc/ganesha supports timestamps until 2038 (0x7fffffff)
Jan 23 10:24:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/etc/ceph/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:24:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Jan 23 10:24:35 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d33a4460d3a4a3015dba4f8c11a57ca82a738fcb8cff5728fc7ff64591543afb/merged/var/lib/ceph/radosgw/ceph-nfs.cephfs.0.0.compute-1.bawllm-rgw/keyring supports timestamps until 2038 (0x7fffffff)
Jan 23 10:24:35 compute-1 podman[235518]: 2026-01-23 10:24:35.494949525 +0000 UTC m=+0.120959067 container init 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:24:35 compute-1 podman[235518]: 2026-01-23 10:24:35.404948873 +0000 UTC m=+0.030958395 image pull aade1b12b8e6196a39b8c83a7f707419487931732368729477a8c2bbcbca1d7c quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec
Jan 23 10:24:35 compute-1 podman[235518]: 2026-01-23 10:24:35.504065719 +0000 UTC m=+0.130075221 container start 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, OSD_FLAVOR=default)
Jan 23 10:24:35 compute-1 bash[235518]: 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] init_logging :LOG :NULL :LOG: Setting log level for all components to NIV_EVENT
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] main :MAIN :EVENT :ganesha.nfsd Starting: Ganesha Version 5.9
Jan 23 10:24:35 compute-1 systemd[1]: Started Ceph nfs.cephfs.0.0.compute-1.bawllm for f3005f84-239a-55b6-a948-8f1fb592b920.
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_set_param_from_conf :NFS STARTUP :EVENT :Configuration file successfully parsed
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] monitoring_init :NFS STARTUP :EVENT :Init monitoring at 0.0.0.0:9587
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] fsal_init_fds_limit :MDCACHE LRU :EVENT :Setting the system-imposed limit on FDs to 1048576.
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :Initializing ID Mapper.
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] init_server_pkgs :NFS STARTUP :EVENT :ID Mapper successfully initialized.
Jan 23 10:24:35 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:35 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:24:36 compute-1 nova_compute[225705]: 2026-01-23 10:24:36.453 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:36.503 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:36 compute-1 nova_compute[225705]: 2026-01-23 10:24:36.691 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:36.994 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:37 compute-1 ceph-mon[80126]: pgmap v939: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 170 B/s rd, 0 op/s
Jan 23 10:24:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:38.506 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:38 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:24:38.926 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:24:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:24:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:38.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:24:39 compute-1 ceph-mon[80126]: pgmap v940: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:24:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:40.509 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:40 compute-1 podman[235579]: 2026-01-23 10:24:40.691985664 +0000 UTC m=+0.093576404 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:24:40 compute-1 ceph-mon[80126]: pgmap v941: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:24:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:41.002 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:41 compute-1 nova_compute[225705]: 2026-01-23 10:24:41.456 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:41 compute-1 nova_compute[225705]: 2026-01-23 10:24:41.693 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:42 compute-1 ceph-mon[80126]: pgmap v942: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 426 B/s rd, 85 B/s wr, 0 op/s
Jan 23 10:24:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:24:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:24:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:24:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:42.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:42 compute-1 sshd-session[235607]: Invalid user sol from 45.148.10.240 port 54522
Jan 23 10:24:42 compute-1 sshd-session[235607]: Connection closed by invalid user sol 45.148.10.240 port 54522 [preauth]
Jan 23 10:24:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:43.005 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:43 compute-1 ceph-mon[80126]: pgmap v943: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 853 B/s rd, 426 B/s wr, 1 op/s
Jan 23 10:24:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:44.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:45.019 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:45 compute-1 sudo[235610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:24:45 compute-1 sudo[235610]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:45 compute-1 sudo[235610]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:45 compute-1 ceph-mon[80126]: pgmap v944: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 426 B/s wr, 1 op/s
Jan 23 10:24:46 compute-1 nova_compute[225705]: 2026-01-23 10:24:46.460 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:46.517 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:46 compute-1 nova_compute[225705]: 2026-01-23 10:24:46.695 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:24:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:24:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:24:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:24:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:47.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:48 compute-1 ceph-mon[80126]: pgmap v945: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 426 B/s wr, 1 op/s
Jan 23 10:24:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:48.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:49.026 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:49 compute-1 ceph-mon[80126]: pgmap v946: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:24:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3038382262' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:24:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3038382262' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:24:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:50.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:24:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:51.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:51 compute-1 nova_compute[225705]: 2026-01-23 10:24:51.492 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:51 compute-1 nova_compute[225705]: 2026-01-23 10:24:51.697 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:24:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:24:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:24:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:24:52 compute-1 ceph-mon[80126]: pgmap v947: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 938 B/s rd, 341 B/s wr, 1 op/s
Jan 23 10:24:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:52.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:24:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:53.034 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:24:53 compute-1 ceph-mon[80126]: pgmap v948: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 341 B/s wr, 1 op/s
Jan 23 10:24:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:54.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:55.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:24:55.055 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:24:55.055 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:24:55.056 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:55 compute-1 sudo[235641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:24:55 compute-1 sudo[235641]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:55 compute-1 sudo[235641]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:55 compute-1 sudo[235672]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:24:55 compute-1 sudo[235672]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:55 compute-1 podman[235658]: 2026-01-23 10:24:55.704958258 +0000 UTC m=+0.087955020 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Jan 23 10:24:55 compute-1 nova_compute[225705]: 2026-01-23 10:24:55.733 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:55 compute-1 nova_compute[225705]: 2026-01-23 10:24:55.733 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:55 compute-1 nova_compute[225705]: 2026-01-23 10:24:55.817 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:24:55 compute-1 nova_compute[225705]: 2026-01-23 10:24:55.903 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:55 compute-1 nova_compute[225705]: 2026-01-23 10:24:55.903 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:55 compute-1 nova_compute[225705]: 2026-01-23 10:24:55.911 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:24:55 compute-1 nova_compute[225705]: 2026-01-23 10:24:55.911 225709 INFO nova.compute.claims [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Claim successful on node compute-1.ctlplane.example.com
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.014 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:56 compute-1 ceph-mon[80126]: pgmap v949: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:24:56 compute-1 sudo[235672]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:56 compute-1 sudo[235761]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:24:56 compute-1 sudo[235761]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:56 compute-1 sudo[235761]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:56 compute-1 sudo[235786]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 list-networks
Jan 23 10:24:56 compute-1 sudo[235786]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:24:56 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:24:56 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1853192220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.499 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.485s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.507 225709 DEBUG nova.compute.provider_tree [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.524 225709 DEBUG nova.scheduler.client.report [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:24:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:24:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:56.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.547 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.643s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.548 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.664 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.664 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.698 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:24:56 compute-1 sudo[235786]: pam_unix(sudo:session): session closed for user root
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.814 225709 DEBUG nova.policy [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.838 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:24:56 compute-1 nova_compute[225705]: 2026-01-23 10:24:56.936 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:24:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:24:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:24:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:24:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:24:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:24:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:24:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:57.042 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.094 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.097 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.097 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Creating image(s)
Jan 23 10:24:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1853192220' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:24:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-1 ceph-mon[80126]: pgmap v950: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 1.2 KiB/s rd, 1 op/s
Jan 23 10:24:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:24:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:57 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.143 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.175 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.213 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.217 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.312 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.314 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.315 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.315 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.354 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.358 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.678 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.320s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.784 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.921 225709 DEBUG nova.objects.instance [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid ee5670f1-f0fa-4c86-855a-ce14c49091ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.937 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.937 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Ensure instance console log exists: /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.938 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.939 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.939 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:24:57 compute-1 nova_compute[225705]: 2026-01-23 10:24:57.945 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Successfully created port: b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:24:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:24:58.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:24:58 compute-1 nova_compute[225705]: 2026-01-23 10:24:58.916 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Successfully updated port: b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:24:58 compute-1 nova_compute[225705]: 2026-01-23 10:24:58.941 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:24:58 compute-1 nova_compute[225705]: 2026-01-23 10:24:58.941 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:24:58 compute-1 nova_compute[225705]: 2026-01-23 10:24:58.941 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:24:59 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:59 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:24:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:24:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:24:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:24:59.045 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:24:59 compute-1 nova_compute[225705]: 2026-01-23 10:24:59.063 225709 DEBUG nova.compute.manager [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:24:59 compute-1 nova_compute[225705]: 2026-01-23 10:24:59.064 225709 DEBUG nova.compute.manager [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing instance network info cache due to event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:24:59 compute-1 nova_compute[225705]: 2026-01-23 10:24:59.064 225709 DEBUG oslo_concurrency.lockutils [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:24:59 compute-1 nova_compute[225705]: 2026-01-23 10:24:59.213 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.041 225709 DEBUG nova.network.neutron [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:00 compute-1 ceph-mon[80126]: pgmap v951: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 23 10:25:00 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:00 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.391 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.391 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance network_info: |[{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.392 225709 DEBUG oslo_concurrency.lockutils [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.393 225709 DEBUG nova.network.neutron [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.398 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start _get_guest_xml network_info=[{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.405 225709 WARNING nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.410 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.411 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.416 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.416 225709 DEBUG nova.virt.libvirt.host [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.417 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.418 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.418 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.419 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.419 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.420 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.420 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.421 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.421 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.422 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.422 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.423 225709 DEBUG nova.virt.hardware [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.428 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:00.538 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:00 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:25:00 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/207192348' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.947 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.520s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:00 compute-1 nova_compute[225705]: 2026-01-23 10:25:00.997 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.004 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:01.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:01 compute-1 anacron[2198]: Job `cron.monthly' started
Jan 23 10:25:01 compute-1 anacron[2198]: Job `cron.monthly' terminated
Jan 23 10:25:01 compute-1 anacron[2198]: Normal exit (3 jobs run)
Jan 23 10:25:01 compute-1 ceph-mon[80126]: pgmap v952: 353 pgs: 353 active+clean; 41 MiB data, 259 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 op/s
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/207192348' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:25:01 compute-1 ceph-mon[80126]: pgmap v953: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:25:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.496 225709 DEBUG nova.network.neutron [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updated VIF entry in instance network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.497 225709 DEBUG nova.network.neutron [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:01 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:25:01 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2995660539' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.501 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.515 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.511s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.517 225709 DEBUG nova.virt.libvirt.vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-201401353',display_name='tempest-TestNetworkBasicOps-server-201401353',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-201401353',id=10,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSiMVclS29CQGuWDzR4wqcRxghXwRX3OnxYcchVIhN1re6S5JbcUZIdPe1ViONpYjthNnTE0ukmKTamuv4VEW3D7ha0cmAwvhq7SF9xxubWvSPNpPeahMeeSWlQXgBMKw==',key_name='tempest-TestNetworkBasicOps-247222292',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-bwgfnixj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:57Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ee5670f1-f0fa-4c86-855a-ce14c49091ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.517 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.518 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.519 225709 DEBUG nova.objects.instance [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid ee5670f1-f0fa-4c86-855a-ce14c49091ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.701 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.871 225709 DEBUG oslo_concurrency.lockutils [req-89f7e66f-0776-4a5d-b1da-46f211eef818 req-b7fae1a3-d3db-4a11-a198-bf92ef17e3ca 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.874 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <uuid>ee5670f1-f0fa-4c86-855a-ce14c49091ec</uuid>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <name>instance-0000000a</name>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <memory>131072</memory>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <vcpu>1</vcpu>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <nova:name>tempest-TestNetworkBasicOps-server-201401353</nova:name>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <nova:creationTime>2026-01-23 10:25:00</nova:creationTime>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <nova:flavor name="m1.nano">
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:memory>128</nova:memory>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:disk>1</nova:disk>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:swap>0</nova:swap>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       </nova:flavor>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <nova:owner>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       </nova:owner>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <nova:ports>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <nova:port uuid="b35832ce-bd22-4306-81e2-4d6c9cc4fb5e">
Jan 23 10:25:01 compute-1 nova_compute[225705]:           <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         </nova:port>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       </nova:ports>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </nova:instance>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <sysinfo type="smbios">
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <system>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <entry name="serial">ee5670f1-f0fa-4c86-855a-ce14c49091ec</entry>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <entry name="uuid">ee5670f1-f0fa-4c86-855a-ce14c49091ec</entry>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </system>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <os>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <boot dev="hd"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <smbios mode="sysinfo"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   </os>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <features>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <vmcoreinfo/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   </features>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <clock offset="utc">
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <timer name="hpet" present="no"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <cpu mode="host-model" match="exact">
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <disk type="network" device="disk">
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk">
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       </source>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <target dev="vda" bus="virtio"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <disk type="network" device="cdrom">
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config">
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       </source>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:25:01 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <target dev="sda" bus="sata"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <interface type="ethernet">
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <mac address="fa:16:3e:52:5a:1e"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <mtu size="1442"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <target dev="tapb35832ce-bd"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <serial type="pty">
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <log file="/var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/console.log" append="off"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <video>
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </video>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <input type="tablet" bus="usb"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <rng model="virtio">
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <controller type="usb" index="0"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     <memballoon model="virtio">
Jan 23 10:25:01 compute-1 nova_compute[225705]:       <stats period="10"/>
Jan 23 10:25:01 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:25:01 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:25:01 compute-1 nova_compute[225705]: </domain>
Jan 23 10:25:01 compute-1 nova_compute[225705]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.876 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Preparing to wait for external event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.876 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.876 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.877 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.878 225709 DEBUG nova.virt.libvirt.vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:24:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-201401353',display_name='tempest-TestNetworkBasicOps-server-201401353',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-201401353',id=10,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSiMVclS29CQGuWDzR4wqcRxghXwRX3OnxYcchVIhN1re6S5JbcUZIdPe1ViONpYjthNnTE0ukmKTamuv4VEW3D7ha0cmAwvhq7SF9xxubWvSPNpPeahMeeSWlQXgBMKw==',key_name='tempest-TestNetworkBasicOps-247222292',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-bwgfnixj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:24:57Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ee5670f1-f0fa-4c86-855a-ce14c49091ec,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.878 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.879 225709 DEBUG nova.network.os_vif_util [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.879 225709 DEBUG os_vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.880 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.880 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.881 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.885 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.885 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb35832ce-bd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.885 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb35832ce-bd, col_values=(('external_ids', {'iface-id': 'b35832ce-bd22-4306-81e2-4d6c9cc4fb5e', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:52:5a:1e', 'vm-uuid': 'ee5670f1-f0fa-4c86-855a-ce14c49091ec'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:01 compute-1 NetworkManager[48978]: <info>  [1769163901.9141] manager: (tapb35832ce-bd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/51)
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.913 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.917 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.925 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:01 compute-1 nova_compute[225705]: 2026-01-23 10:25:01.926 225709 INFO os_vif [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd')
Jan 23 10:25:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:02 compute-1 ceph-mon[80126]: Health check failed: 1 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 10:25:02 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2995660539' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.384 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.385 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.385 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:52:5a:1e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.386 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Using config drive
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.425 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:02.543 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.790 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Creating config drive at /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.799 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx0qe8u6x execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.933 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpx0qe8u6x" returned: 0 in 0.134s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.975 225709 DEBUG nova.storage.rbd_utils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:25:02 compute-1 nova_compute[225705]: 2026-01-23 10:25:02.979 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:03.053 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:03 compute-1 ceph-mon[80126]: pgmap v954: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.534 225709 DEBUG oslo_concurrency.processutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config ee5670f1-f0fa-4c86-855a-ce14c49091ec_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.556s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.536 225709 INFO nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deleting local config drive /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec/disk.config because it was imported into RBD.
Jan 23 10:25:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:04.546 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:04 compute-1 systemd[1]: Starting libvirt secret daemon...
Jan 23 10:25:04 compute-1 systemd[1]: Started libvirt secret daemon.
Jan 23 10:25:04 compute-1 kernel: tapb35832ce-bd: entered promiscuous mode
Jan 23 10:25:04 compute-1 NetworkManager[48978]: <info>  [1769163904.6639] manager: (tapb35832ce-bd): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.665 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:04 compute-1 ovn_controller[133293]: 2026-01-23T10:25:04Z|00076|binding|INFO|Claiming lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for this chassis.
Jan 23 10:25:04 compute-1 ovn_controller[133293]: 2026-01-23T10:25:04Z|00077|binding|INFO|b35832ce-bd22-4306-81e2-4d6c9cc4fb5e: Claiming fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.669 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.676 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:04 compute-1 systemd-machined[194551]: New machine qemu-4-instance-0000000a.
Jan 23 10:25:04 compute-1 systemd-udevd[236158]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:25:04 compute-1 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Jan 23 10:25:04 compute-1 NetworkManager[48978]: <info>  [1769163904.7291] device (tapb35832ce-bd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:25:04 compute-1 NetworkManager[48978]: <info>  [1769163904.7307] device (tapb35832ce-bd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.744 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:04 compute-1 ovn_controller[133293]: 2026-01-23T10:25:04Z|00078|binding|INFO|Setting lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e ovn-installed in OVS
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.751 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.861 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:5a:1e 10.100.0.7'], port_security=['fa:16:3e:52:5a:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ee5670f1-f0fa-4c86-855a-ce14c49091ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '7e7149f7-1f80-4cb0-a07a-4ad2ce209150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79ce8e9f-5595-435d-b3c2-9a811b1982a6, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.862 143098 INFO neutron.agent.ovn.metadata.agent [-] Port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e in datapath a36237e8-b709-4a50-8f8b-9cccdf12f329 bound to our chassis
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.864 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a36237e8-b709-4a50-8f8b-9cccdf12f329
Jan 23 10:25:04 compute-1 ovn_controller[133293]: 2026-01-23T10:25:04Z|00079|binding|INFO|Setting lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e up in Southbound
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.879 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1db16338-ab18-4957-ae9e-93716c4f5d30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.880 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa36237e8-b1 in ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.882 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa36237e8-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.883 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[90181eed-e79e-406a-b5c8-83373724f982]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.884 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[459b8a71-f361-4e2b-a658-7b1b9006dbba]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.897 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Jan 23 10:25:04 compute-1 nova_compute[225705]: 2026-01-23 10:25:04.898 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.905 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[758c2a9e-5765-4a8e-9cdf-a4ed31264132]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.935 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6bf7f991-165d-4915-9cb1-b1e535fe235c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.978 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c879266d-c140-44e5-b77d-f538d7bfbfb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:04 compute-1 NetworkManager[48978]: <info>  [1769163904.9871] manager: (tapa36237e8-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Jan 23 10:25:04 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:04.988 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[17ddea73-6656-4000-bcc4-ff3facaa1d03]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:04 compute-1 systemd-udevd[236160]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.028 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[86f7d7b4-d051-4079-a7e4-614e37c1151e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.032 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[dc50a87f-144b-468f-a86d-504cdd8cf88b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:05.056 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:05 compute-1 NetworkManager[48978]: <info>  [1769163905.0636] device (tapa36237e8-b0): carrier: link connected
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.073 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[33bafcb1-52a9-461b-8e33-623de263b84c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.097 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[7778783e-da58-4e97-b397-fd8deb3028e1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa36237e8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:d6:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505904, 'reachable_time': 41458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236216, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.114 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[40572e1b-2a47-47e6-a8cf-f810a80719f1]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:d64c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 505904, 'tstamp': 505904}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 236226, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.135 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c3f9098b-6dfd-4028-badf-be0b97414586]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa36237e8-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:d6:4c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505904, 'reachable_time': 41458, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 236229, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.170 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0083f40e-46d2-4422-a3c5-32cfbb24a25f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.252 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[08a1fc8e-959e-47d6-958e-56f9f1781e29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.254 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa36237e8-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.254 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.254 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa36237e8-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:05 compute-1 nova_compute[225705]: 2026-01-23 10:25:05.256 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:05 compute-1 NetworkManager[48978]: <info>  [1769163905.2579] manager: (tapa36237e8-b0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Jan 23 10:25:05 compute-1 kernel: tapa36237e8-b0: entered promiscuous mode
Jan 23 10:25:05 compute-1 nova_compute[225705]: 2026-01-23 10:25:05.259 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.260 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa36237e8-b0, col_values=(('external_ids', {'iface-id': 'fa751ede-fa2b-4950-a999-549fdae5ffae'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:05 compute-1 ovn_controller[133293]: 2026-01-23T10:25:05Z|00080|binding|INFO|Releasing lport fa751ede-fa2b-4950-a999-549fdae5ffae from this chassis (sb_readonly=0)
Jan 23 10:25:05 compute-1 nova_compute[225705]: 2026-01-23 10:25:05.261 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:05 compute-1 nova_compute[225705]: 2026-01-23 10:25:05.287 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.288 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a36237e8-b709-4a50-8f8b-9cccdf12f329.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a36237e8-b709-4a50-8f8b-9cccdf12f329.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.288 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0ce9b3ec-cece-40fa-b02a-e07ed9d1e938]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.289 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: global
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     log         /dev/log local0 debug
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     log-tag     haproxy-metadata-proxy-a36237e8-b709-4a50-8f8b-9cccdf12f329
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     user        root
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     group       root
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     maxconn     1024
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     pidfile     /var/lib/neutron/external/pids/a36237e8-b709-4a50-8f8b-9cccdf12f329.pid.haproxy
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     daemon
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: defaults
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     log global
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     mode http
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     option httplog
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     option dontlognull
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     option http-server-close
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     option forwardfor
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     retries                 3
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     timeout http-request    30s
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     timeout connect         30s
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     timeout client          32s
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     timeout server          32s
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     timeout http-keep-alive 30s
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: listen listener
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     bind 169.254.169.254:80
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:     http-request add-header X-OVN-Network-ID a36237e8-b709-4a50-8f8b-9cccdf12f329
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:25:05 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:05.290 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'env', 'PROCESS_TAG=haproxy-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a36237e8-b709-4a50-8f8b-9cccdf12f329.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:25:05 compute-1 sudo[236239]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:25:05 compute-1 sudo[236239]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:05 compute-1 sudo[236239]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:05 compute-1 podman[236287]: 2026-01-23 10:25:05.775876319 +0000 UTC m=+0.080810757 container create b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:25:05 compute-1 systemd[1]: Started libpod-conmon-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d.scope.
Jan 23 10:25:05 compute-1 podman[236287]: 2026-01-23 10:25:05.733940723 +0000 UTC m=+0.038875221 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:25:05 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:25:05 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/60ee88f003832c2d12671caf45936712538fa9185a4ea2311104f1ecd1bc220b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:25:05 compute-1 podman[236287]: 2026-01-23 10:25:05.8838685 +0000 UTC m=+0.188802988 container init b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Jan 23 10:25:05 compute-1 podman[236287]: 2026-01-23 10:25:05.893340205 +0000 UTC m=+0.198274633 container start b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 10:25:05 compute-1 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : New worker (236308) forked
Jan 23 10:25:05 compute-1 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : Loading success.
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.415 225709 DEBUG nova.compute.manager [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.415 225709 DEBUG oslo_concurrency.lockutils [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.416 225709 DEBUG oslo_concurrency.lockutils [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.416 225709 DEBUG oslo_concurrency.lockutils [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.416 225709 DEBUG nova.compute.manager [req-23341686-5c48-4da3-9eb7-5758a0bde23f req-102d2aa5-a41f-4872-8819-9a8975535b74 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Processing event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:25:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:06.549 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.616 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.617 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163906.6154583, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.618 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Started (Lifecycle Event)
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.624 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.628 225709 INFO nova.virt.libvirt.driver [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance spawned successfully.
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.629 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.636 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.646 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.654 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.654 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.655 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.656 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.657 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.658 225709 DEBUG nova.virt.libvirt.driver [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.666 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.668 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163906.6157992, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.669 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Paused (Lifecycle Event)
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.705 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.751 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.757 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163906.6229315, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.757 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Resumed (Lifecycle Event)
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.781 225709 INFO nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 9.69 seconds to spawn the instance on the hypervisor.
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.781 225709 DEBUG nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.783 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.795 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.892 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:06 compute-1 nova_compute[225705]: 2026-01-23 10:25:06.913 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.047 225709 INFO nova.compute.manager [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 11.18 seconds to build instance.
Jan 23 10:25:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:07.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.725 225709 DEBUG oslo_concurrency.lockutils [None req-0c2de304-d2a6-4541-93f1-be3d37963b59 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.991s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:07 compute-1 ceph-mon[80126]: pgmap v955: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 2.1 MiB/s wr, 32 op/s
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.895 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.895 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.896 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.896 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:25:07 compute-1 nova_compute[225705]: 2026-01-23 10:25:07.896 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:25:08 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3677633277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.413 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.516s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.491 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.492 225709 DEBUG nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] skipping disk for instance-0000000a as it does not have a path _get_instance_disk_info_from_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11231
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.511 225709 DEBUG nova.compute.manager [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.512 225709 DEBUG oslo_concurrency.lockutils [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.513 225709 DEBUG oslo_concurrency.lockutils [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.513 225709 DEBUG oslo_concurrency.lockutils [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.514 225709 DEBUG nova.compute.manager [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] No waiting events found dispatching network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.515 225709 WARNING nova.compute.manager [req-b69273c5-47f7-430f-9590-12e9a6d40d9b req-65a373e8-5264-4dc1-8f08-c409ae9bf012 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received unexpected event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for instance with vm_state active and task_state None.
Jan 23 10:25:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:08.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.741 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.744 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4696MB free_disk=59.967525482177734GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.745 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.746 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.952 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Instance ee5670f1-f0fa-4c86-855a-ce14c49091ec actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.953 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.953 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=59GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:25:08 compute-1 nova_compute[225705]: 2026-01-23 10:25:08.992 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:09.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:25:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3681678963' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:09 compute-1 nova_compute[225705]: 2026-01-23 10:25:09.488 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.496s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:09 compute-1 nova_compute[225705]: 2026-01-23 10:25:09.494 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:25:09 compute-1 ceph-mon[80126]: pgmap v956: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 2.1 MiB/s wr, 43 op/s
Jan 23 10:25:09 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:09 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3677633277' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:09 compute-1 nova_compute[225705]: 2026-01-23 10:25:09.676 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:25:09 compute-1 nova_compute[225705]: 2026-01-23 10:25:09.721 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:25:09 compute-1 nova_compute[225705]: 2026-01-23 10:25:09.722 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.977s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:10 compute-1 sudo[236370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:25:10 compute-1 sudo[236370]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:10 compute-1 sudo[236370]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:10.554 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:10 compute-1 nova_compute[225705]: 2026-01-23 10:25:10.723 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:10 compute-1 nova_compute[225705]: 2026-01-23 10:25:10.725 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:10 compute-1 nova_compute[225705]: 2026-01-23 10:25:10.725 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:25:10 compute-1 nova_compute[225705]: 2026-01-23 10:25:10.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:10 compute-1 nova_compute[225705]: 2026-01-23 10:25:10.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:10 compute-1 ceph-mon[80126]: pgmap v957: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 2.1 MiB/s wr, 43 op/s
Jan 23 10:25:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1377665100' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3681678963' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3204813948' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:10 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:25:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1091720736' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:11.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:11 compute-1 nova_compute[225705]: 2026-01-23 10:25:11.707 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:11 compute-1 podman[236396]: 2026-01-23 10:25:11.755905763 +0000 UTC m=+0.145010715 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:25:11 compute-1 nova_compute[225705]: 2026-01-23 10:25:11.916 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:11 compute-1 ovn_controller[133293]: 2026-01-23T10:25:11Z|00081|binding|INFO|Releasing lport fa751ede-fa2b-4950-a999-549fdae5ffae from this chassis (sb_readonly=0)
Jan 23 10:25:11 compute-1 NetworkManager[48978]: <info>  [1769163911.9557] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Jan 23 10:25:11 compute-1 NetworkManager[48978]: <info>  [1769163911.9567] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Jan 23 10:25:11 compute-1 nova_compute[225705]: 2026-01-23 10:25:11.957 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:11 compute-1 ovn_controller[133293]: 2026-01-23T10:25:11Z|00082|binding|INFO|Releasing lport fa751ede-fa2b-4950-a999-549fdae5ffae from this chassis (sb_readonly=0)
Jan 23 10:25:11 compute-1 nova_compute[225705]: 2026-01-23 10:25:11.989 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:11 compute-1 nova_compute[225705]: 2026-01-23 10:25:11.998 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:12 compute-1 nova_compute[225705]: 2026-01-23 10:25:12.216 225709 DEBUG nova.compute.manager [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:12 compute-1 nova_compute[225705]: 2026-01-23 10:25:12.217 225709 DEBUG nova.compute.manager [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing instance network info cache due to event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:25:12 compute-1 nova_compute[225705]: 2026-01-23 10:25:12.218 225709 DEBUG oslo_concurrency.lockutils [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:25:12 compute-1 nova_compute[225705]: 2026-01-23 10:25:12.219 225709 DEBUG oslo_concurrency.lockutils [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:25:12 compute-1 nova_compute[225705]: 2026-01-23 10:25:12.219 225709 DEBUG nova.network.neutron [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:25:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:12.556 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1971763186' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:12 compute-1 nova_compute[225705]: 2026-01-23 10:25:12.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #58. Immutable memtables: 0.
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.017589) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 33] Flushing memtable with next log file: 58
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913017690, "job": 33, "event": "flush_started", "num_memtables": 1, "num_entries": 1267, "num_deletes": 251, "total_data_size": 3164328, "memory_usage": 3226864, "flush_reason": "Manual Compaction"}
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 33] Level-0 flush table #59: started
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913032436, "cf_name": "default", "job": 33, "event": "table_file_creation", "file_number": 59, "file_size": 2012832, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 29862, "largest_seqno": 31124, "table_properties": {"data_size": 2007176, "index_size": 2987, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12560, "raw_average_key_size": 20, "raw_value_size": 1995757, "raw_average_value_size": 3234, "num_data_blocks": 128, "num_entries": 617, "num_filter_entries": 617, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163822, "oldest_key_time": 1769163822, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 59, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 33] Flush lasted 14917 microseconds, and 7019 cpu microseconds.
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032520) [db/flush_job.cc:967] [default] [JOB 33] Level-0 flush table #59: 2012832 bytes OK
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.032548) [db/memtable_list.cc:519] [default] Level-0 commit table #59 started
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.034267) [db/memtable_list.cc:722] [default] Level-0 commit table #59: memtable #1 done
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.034287) EVENT_LOG_v1 {"time_micros": 1769163913034280, "job": 33, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.034312) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 33] Try to delete WAL files size 3158232, prev total WAL file size 3158232, number of live WAL files 2.
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000055.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.035704) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032323539' seq:72057594037927935, type:22 .. '7061786F730032353131' seq:0, type:0; will stop at (end)
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 34] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 33 Base level 0, inputs: [59(1965KB)], [57(12MB)]
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913035792, "job": 34, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [59], "files_L6": [57], "score": -1, "input_data_size": 14774520, "oldest_snapshot_seqno": -1}
Jan 23 10:25:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.016000504s ======
Jan 23 10:25:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:13.071 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.016000504s
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 34] Generated table #60: 5880 keys, 12616933 bytes, temperature: kUnknown
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913125569, "cf_name": "default", "job": 34, "event": "table_file_creation", "file_number": 60, "file_size": 12616933, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12579081, "index_size": 22062, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 14725, "raw_key_size": 152068, "raw_average_key_size": 25, "raw_value_size": 12474171, "raw_average_value_size": 2121, "num_data_blocks": 881, "num_entries": 5880, "num_filter_entries": 5880, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769163913, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 60, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.125848) [db/compaction/compaction_job.cc:1663] [default] [JOB 34] Compacted 1@0 + 1@6 files to L6 => 12616933 bytes
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.128728) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.4 rd, 140.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.9, 12.2 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(13.6) write-amplify(6.3) OK, records in: 6401, records dropped: 521 output_compression: NoCompression
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.128745) EVENT_LOG_v1 {"time_micros": 1769163913128737, "job": 34, "event": "compaction_finished", "compaction_time_micros": 89861, "compaction_time_cpu_micros": 25777, "output_level": 6, "num_output_files": 1, "total_output_size": 12616933, "num_input_records": 6401, "num_output_records": 5880, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000059.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913129179, "job": 34, "event": "table_file_deletion", "file_number": 59}
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000057.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769163913131363, "job": 34, "event": "table_file_deletion", "file_number": 57}
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.035554) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131437) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131446) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131449) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131452) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:25:13.131455) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:25:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:13 compute-1 ceph-mon[80126]: pgmap v958: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 2.1 MiB/s wr, 119 op/s
Jan 23 10:25:13 compute-1 nova_compute[225705]: 2026-01-23 10:25:13.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:25:14 compute-1 nova_compute[225705]: 2026-01-23 10:25:14.031 225709 DEBUG nova.network.neutron [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updated VIF entry in instance network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:25:14 compute-1 nova_compute[225705]: 2026-01-23 10:25:14.032 225709 DEBUG nova.network.neutron [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:14 compute-1 nova_compute[225705]: 2026-01-23 10:25:14.057 225709 DEBUG oslo_concurrency.lockutils [req-4a3a8fdd-03a7-4a42-8952-7f5a875df9a0 req-c620ae91-2edd-4db8-a24e-70de1993a123 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:25:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:14.560 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:14 compute-1 ceph-mon[80126]: pgmap v959: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:25:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:15.090 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:16.563 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:16 compute-1 nova_compute[225705]: 2026-01-23 10:25:16.710 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:16 compute-1 ceph-mon[80126]: pgmap v960: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:25:16 compute-1 nova_compute[225705]: 2026-01-23 10:25:16.961 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:17.093 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:18.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:18 compute-1 ceph-mon[80126]: pgmap v961: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:25:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:19.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:20 compute-1 ovn_controller[133293]: 2026-01-23T10:25:20Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 10:25:20 compute-1 ovn_controller[133293]: 2026-01-23T10:25:20Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 10:25:20 compute-1 ceph-mon[80126]: pgmap v962: 353 pgs: 353 active+clean; 88 MiB data, 281 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 23 10:25:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:20.568 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:21.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:21 compute-1 nova_compute[225705]: 2026-01-23 10:25:21.713 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:21 compute-1 nova_compute[225705]: 2026-01-23 10:25:21.962 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:22 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:22.571 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:23.103 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:23 compute-1 ceph-mon[80126]: pgmap v963: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 2.1 MiB/s wr, 123 op/s
Jan 23 10:25:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:24.574 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:25 compute-1 ceph-mon[80126]: pgmap v964: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 297 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:25:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:25.107 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:25 compute-1 nova_compute[225705]: 2026-01-23 10:25:25.143 225709 INFO nova.compute.manager [None req-06ea0805-e3dd-4ccb-9753-3f247a501c58 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Get console output
Jan 23 10:25:25 compute-1 nova_compute[225705]: 2026-01-23 10:25:25.151 230072 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Jan 23 10:25:25 compute-1 sudo[236432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:25:25 compute-1 sudo[236432]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:25 compute-1 sudo[236432]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:25 compute-1 ovn_controller[133293]: 2026-01-23T10:25:25Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 10:25:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:26.578 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:26 compute-1 podman[236457]: 2026-01-23 10:25:26.65938703 +0000 UTC m=+0.051900676 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:25:26 compute-1 nova_compute[225705]: 2026-01-23 10:25:26.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:26 compute-1 ceph-mon[80126]: pgmap v965: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 297 KiB/s rd, 2.1 MiB/s wr, 59 op/s
Jan 23 10:25:26 compute-1 nova_compute[225705]: 2026-01-23 10:25:26.964 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:27.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:28 compute-1 ovn_controller[133293]: 2026-01-23T10:25:28Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 10:25:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:28.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:28 compute-1 ceph-mon[80126]: pgmap v966: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 331 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:25:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:29.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:30.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:30 compute-1 ceph-mon[80126]: pgmap v967: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 330 KiB/s rd, 2.1 MiB/s wr, 64 op/s
Jan 23 10:25:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:31.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:31 compute-1 ovn_controller[133293]: 2026-01-23T10:25:31Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:52:5a:1e 10.100.0.7
Jan 23 10:25:31 compute-1 nova_compute[225705]: 2026-01-23 10:25:31.720 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:31 compute-1 nova_compute[225705]: 2026-01-23 10:25:31.967 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.041 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.043 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.044 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.355 225709 DEBUG nova.compute.manager [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.356 225709 DEBUG nova.compute.manager [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing instance network info cache due to event network-changed-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.357 225709 DEBUG oslo_concurrency.lockutils [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.357 225709 DEBUG oslo_concurrency.lockutils [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.357 225709 DEBUG nova.network.neutron [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Refreshing network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.437 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.438 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.440 225709 INFO nova.compute.manager [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Terminating instance
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.441 225709 DEBUG nova.compute.manager [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:25:32 compute-1 kernel: tapb35832ce-bd (unregistering): left promiscuous mode
Jan 23 10:25:32 compute-1 NetworkManager[48978]: <info>  [1769163932.4978] device (tapb35832ce-bd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.505 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 ovn_controller[133293]: 2026-01-23T10:25:32Z|00083|binding|INFO|Releasing lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e from this chassis (sb_readonly=0)
Jan 23 10:25:32 compute-1 ovn_controller[133293]: 2026-01-23T10:25:32Z|00084|binding|INFO|Setting lport b35832ce-bd22-4306-81e2-4d6c9cc4fb5e down in Southbound
Jan 23 10:25:32 compute-1 ovn_controller[133293]: 2026-01-23T10:25:32Z|00085|binding|INFO|Removing iface tapb35832ce-bd ovn-installed in OVS
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.518 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:52:5a:1e 10.100.0.7'], port_security=['fa:16:3e:52:5a:1e 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': 'ee5670f1-f0fa-4c86-855a-ce14c49091ec', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '7e7149f7-1f80-4cb0-a07a-4ad2ce209150', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=79ce8e9f-5595-435d-b3c2-9a811b1982a6, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.520 143098 INFO neutron.agent.ovn.metadata.agent [-] Port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e in datapath a36237e8-b709-4a50-8f8b-9cccdf12f329 unbound from our chassis
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.521 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a36237e8-b709-4a50-8f8b-9cccdf12f329, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.523 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[f107e192-6970-43ff-9fdd-53c01755983b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.523 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 namespace which is not needed anymore
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Jan 23 10:25:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:32 compute-1 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 14.303s CPU time.
Jan 23 10:25:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:32.588 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:32 compute-1 systemd-machined[194551]: Machine qemu-4-instance-0000000a terminated.
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.669 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.678 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.685 225709 INFO nova.virt.libvirt.driver [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Instance destroyed successfully.
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.685 225709 DEBUG nova.objects.instance [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid ee5670f1-f0fa-4c86-855a-ce14c49091ec obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.709 225709 DEBUG nova.virt.libvirt.vif [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:24:54Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-201401353',display_name='tempest-TestNetworkBasicOps-server-201401353',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-201401353',id=10,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDSiMVclS29CQGuWDzR4wqcRxghXwRX3OnxYcchVIhN1re6S5JbcUZIdPe1ViONpYjthNnTE0ukmKTamuv4VEW3D7ha0cmAwvhq7SF9xxubWvSPNpPeahMeeSWlQXgBMKw==',key_name='tempest-TestNetworkBasicOps-247222292',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:25:06Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-bwgfnixj',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:25:06Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=ee5670f1-f0fa-4c86-855a-ce14c49091ec,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": 
{}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.710 225709 DEBUG nova.network.os_vif_util [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.711 225709 DEBUG nova.network.os_vif_util [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.712 225709 DEBUG os_vif [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.713 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.714 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb35832ce-bd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:32 compute-1 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : haproxy version is 2.8.14-c23fe91
Jan 23 10:25:32 compute-1 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [NOTICE]   (236306) : path to executable is /usr/sbin/haproxy
Jan 23 10:25:32 compute-1 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [WARNING]  (236306) : Exiting Master process...
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [ALERT]    (236306) : Current worker (236308) exited with code 143 (Terminated)
Jan 23 10:25:32 compute-1 neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329[236302]: [WARNING]  (236306) : All workers exited. Exiting... (0)
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.718 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 systemd[1]: libpod-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d.scope: Deactivated successfully.
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.724 225709 INFO os_vif [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:52:5a:1e,bridge_name='br-int',has_traffic_filtering=True,id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e,network=Network(a36237e8-b709-4a50-8f8b-9cccdf12f329),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb35832ce-bd')
Jan 23 10:25:32 compute-1 podman[236502]: 2026-01-23 10:25:32.726267878 +0000 UTC m=+0.072432906 container died b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Jan 23 10:25:32 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d-userdata-shm.mount: Deactivated successfully.
Jan 23 10:25:32 compute-1 systemd[1]: var-lib-containers-storage-overlay-60ee88f003832c2d12671caf45936712538fa9185a4ea2311104f1ecd1bc220b-merged.mount: Deactivated successfully.
Jan 23 10:25:32 compute-1 podman[236502]: 2026-01-23 10:25:32.771061842 +0000 UTC m=+0.117226900 container cleanup b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Jan 23 10:25:32 compute-1 systemd[1]: libpod-conmon-b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d.scope: Deactivated successfully.
Jan 23 10:25:32 compute-1 ceph-mon[80126]: pgmap v968: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 331 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:25:32 compute-1 podman[236561]: 2026-01-23 10:25:32.867275107 +0000 UTC m=+0.059046169 container remove b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.874 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3ce0b25c-a7b9-44fb-97fd-08d35bf380ba]: (4, ('Fri Jan 23 10:25:32 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 (b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d)\nb80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d\nFri Jan 23 10:25:32 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 (b80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d)\nb80179e4a3e24f4b89b60dfe7b6f0f6bd20a80b75cbc4c632b135384e45abf3d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.877 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1a29c67e-f622-4270-abc3-9ecfcdc9de1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.879 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa36237e8-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.882 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 kernel: tapa36237e8-b0: left promiscuous mode
Jan 23 10:25:32 compute-1 nova_compute[225705]: 2026-01-23 10:25:32.905 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.911 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0f044922-d05b-4592-968a-64dc236b9457]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.931 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[821f446c-e57f-4d27-b1db-4c411816b48f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.934 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[8d431976-2a11-4123-9eb2-3ba9e72b07c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.957 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[b0832980-2c64-4bcf-a100-36b3cb33aba4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 505895, 'reachable_time': 28738, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 236576, 'error': None, 'target': 'ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.961 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a36237e8-b709-4a50-8f8b-9cccdf12f329 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:25:32 compute-1 systemd[1]: run-netns-ovnmeta\x2da36237e8\x2db709\x2d4a50\x2d8f8b\x2d9cccdf12f329.mount: Deactivated successfully.
Jan 23 10:25:32 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:32.961 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[04d69142-fb67-4453-aa4f-013452e1e9b4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:25:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:33.124 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:33 compute-1 nova_compute[225705]: 2026-01-23 10:25:33.133 225709 INFO nova.virt.libvirt.driver [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deleting instance files /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec_del
Jan 23 10:25:33 compute-1 nova_compute[225705]: 2026-01-23 10:25:33.134 225709 INFO nova.virt.libvirt.driver [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deletion of /var/lib/nova/instances/ee5670f1-f0fa-4c86-855a-ce14c49091ec_del complete
Jan 23 10:25:33 compute-1 nova_compute[225705]: 2026-01-23 10:25:33.200 225709 INFO nova.compute.manager [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 0.76 seconds to destroy the instance on the hypervisor.
Jan 23 10:25:33 compute-1 nova_compute[225705]: 2026-01-23 10:25:33.200 225709 DEBUG oslo.service.loopingcall [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:25:33 compute-1 nova_compute[225705]: 2026-01-23 10:25:33.201 225709 DEBUG nova.compute.manager [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:25:33 compute-1 nova_compute[225705]: 2026-01-23 10:25:33.201 225709 DEBUG nova.network.neutron [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:25:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:34 compute-1 nova_compute[225705]: 2026-01-23 10:25:34.480 225709 DEBUG nova.compute.manager [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-unplugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:34 compute-1 nova_compute[225705]: 2026-01-23 10:25:34.480 225709 DEBUG oslo_concurrency.lockutils [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:34 compute-1 nova_compute[225705]: 2026-01-23 10:25:34.480 225709 DEBUG oslo_concurrency.lockutils [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:34 compute-1 nova_compute[225705]: 2026-01-23 10:25:34.481 225709 DEBUG oslo_concurrency.lockutils [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:34 compute-1 nova_compute[225705]: 2026-01-23 10:25:34.481 225709 DEBUG nova.compute.manager [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] No waiting events found dispatching network-vif-unplugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:25:34 compute-1 nova_compute[225705]: 2026-01-23 10:25:34.481 225709 DEBUG nova.compute.manager [req-e40555f1-0d6c-4877-ad72-48b03038a1ee req-55e8bc93-e5e0-42bf-9673-caec7eb64f8f 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-unplugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:25:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:34.592 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:34 compute-1 ceph-mon[80126]: pgmap v969: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 14 KiB/s wr, 6 op/s
Jan 23 10:25:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:35.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:35 compute-1 nova_compute[225705]: 2026-01-23 10:25:35.814 225709 DEBUG nova.network.neutron [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:35 compute-1 nova_compute[225705]: 2026-01-23 10:25:35.835 225709 INFO nova.compute.manager [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Took 2.63 seconds to deallocate network for instance.
Jan 23 10:25:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:35 compute-1 nova_compute[225705]: 2026-01-23 10:25:35.877 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:35 compute-1 nova_compute[225705]: 2026-01-23 10:25:35.878 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:35 compute-1 nova_compute[225705]: 2026-01-23 10:25:35.931 225709 DEBUG oslo_concurrency.processutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:25:35 compute-1 nova_compute[225705]: 2026-01-23 10:25:35.956 225709 DEBUG nova.network.neutron [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updated VIF entry in instance network info cache for port b35832ce-bd22-4306-81e2-4d6c9cc4fb5e. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:25:35 compute-1 nova_compute[225705]: 2026-01-23 10:25:35.957 225709 DEBUG nova.network.neutron [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [{"id": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "address": "fa:16:3e:52:5a:1e", "network": {"id": "a36237e8-b709-4a50-8f8b-9cccdf12f329", "bridge": "br-int", "label": "tempest-network-smoke--873619526", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb35832ce-bd", "ovs_interfaceid": "b35832ce-bd22-4306-81e2-4d6c9cc4fb5e", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.002 225709 DEBUG oslo_concurrency.lockutils [req-d474b2bb-8eff-46c0-b4b1-7b9b52ce2441 req-d71bccb4-43e8-419b-a4de-6ec7862450ea 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-ee5670f1-f0fa-4c86-855a-ce14c49091ec" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:25:36 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:25:36 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/976421879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.396 225709 DEBUG oslo_concurrency.processutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.466s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.405 225709 DEBUG nova.compute.provider_tree [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.421 225709 DEBUG nova.scheduler.client.report [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.472 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.594s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.495 225709 INFO nova.scheduler.client.report [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance ee5670f1-f0fa-4c86-855a-ce14c49091ec
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.561 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.562 225709 DEBUG oslo_concurrency.lockutils [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 DEBUG oslo_concurrency.lockutils [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 DEBUG oslo_concurrency.lockutils [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] No waiting events found dispatching network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.563 225709 WARNING nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received unexpected event network-vif-plugged-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e for instance with vm_state deleted and task_state None.
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.564 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Received event network-vif-deleted-b35832ce-bd22-4306-81e2-4d6c9cc4fb5e external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.564 225709 INFO nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Neutron deleted interface b35832ce-bd22-4306-81e2-4d6c9cc4fb5e; detaching it from the instance and deleting it from the info cache
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.564 225709 DEBUG nova.network.neutron [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.568 225709 DEBUG oslo_concurrency.lockutils [None req-03ad258d-d137-4fb6-a96e-64d28214ab34 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "ee5670f1-f0fa-4c86-855a-ce14c49091ec" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.130s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.590 225709 DEBUG nova.compute.manager [req-4db4d73f-78ae-4b9f-a2e8-9543c350e970 req-539da929-a627-4e78-89d8-13c49dd28ccb 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Detach interface failed, port_id=b35832ce-bd22-4306-81e2-4d6c9cc4fb5e, reason: Instance ee5670f1-f0fa-4c86-855a-ce14c49091ec could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 10:25:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:36.595 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:36 compute-1 nova_compute[225705]: 2026-01-23 10:25:36.722 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:36 compute-1 ceph-mon[80126]: pgmap v970: 353 pgs: 353 active+clean; 121 MiB data, 310 MiB used, 60 GiB / 60 GiB avail; 34 KiB/s rd, 14 KiB/s wr, 6 op/s
Jan 23 10:25:36 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/976421879' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:37.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:37 compute-1 nova_compute[225705]: 2026-01-23 10:25:37.718 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:38.598 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:38 compute-1 ceph-mon[80126]: pgmap v971: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 54 KiB/s rd, 15 KiB/s wr, 34 op/s
Jan 23 10:25:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:39.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:39 compute-1 nova_compute[225705]: 2026-01-23 10:25:39.869 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:39 compute-1 nova_compute[225705]: 2026-01-23 10:25:39.985 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:40.602 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:40 compute-1 ceph-mon[80126]: pgmap v972: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 15 KiB/s wr, 28 op/s
Jan 23 10:25:41 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:41.047 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:25:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:41.140 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:41 compute-1 nova_compute[225705]: 2026-01-23 10:25:41.726 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:42.606 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:42 compute-1 podman[236606]: 2026-01-23 10:25:42.715777125 +0000 UTC m=+0.110607195 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:25:42 compute-1 nova_compute[225705]: 2026-01-23 10:25:42.720 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:43 compute-1 ceph-mon[80126]: pgmap v973: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 15 KiB/s wr, 29 op/s
Jan 23 10:25:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:43.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:44 compute-1 ceph-mon[80126]: pgmap v974: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:25:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:44.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 10:25:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:45.146 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 10:25:45 compute-1 sudo[236635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:25:45 compute-1 sudo[236635]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:25:45 compute-1 sudo[236635]: pam_unix(sudo:session): session closed for user root
Jan 23 10:25:46 compute-1 ceph-mon[80126]: pgmap v975: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:25:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:25:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:46.611 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:25:46 compute-1 nova_compute[225705]: 2026-01-23 10:25:46.776 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:47.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:47 compute-1 nova_compute[225705]: 2026-01-23 10:25:47.684 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769163932.682029, ee5670f1-f0fa-4c86-855a-ce14c49091ec => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:25:47 compute-1 nova_compute[225705]: 2026-01-23 10:25:47.684 225709 INFO nova.compute.manager [-] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] VM Stopped (Lifecycle Event)
Jan 23 10:25:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [WARNING] 022/102547 (4) : Server backend/nfs.cephfs.2 is DOWN, reason: Layer4 connection problem, info: "Connection refused", check duration: 0ms. 0 active and 0 backup servers left. 0 sessions active, 0 requeued, 0 remaining in queue.
Jan 23 10:25:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm[85333]: [ALERT] 022/102547 (4) : backend 'backend' has no server available!
Jan 23 10:25:47 compute-1 nova_compute[225705]: 2026-01-23 10:25:47.707 225709 DEBUG nova.compute.manager [None req-8ce47605-8964-4f06-9c06-e36a4c265841 - - - - - -] [instance: ee5670f1-f0fa-4c86-855a-ce14c49091ec] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:25:47 compute-1 nova_compute[225705]: 2026-01-23 10:25:47.724 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:48 compute-1 ceph-mon[80126]: pgmap v976: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:25:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:48.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:49.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/384635064' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:25:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/384635064' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:25:50 compute-1 ceph-mon[80126]: pgmap v977: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:25:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:25:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:50.617 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:51.159 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:51 compute-1 nova_compute[225705]: 2026-01-23 10:25:51.779 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:52 compute-1 ceph-mon[80126]: pgmap v978: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:25:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:52.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:52 compute-1 nova_compute[225705]: 2026-01-23 10:25:52.726 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:53.162 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:54 compute-1 ceph-mon[80126]: pgmap v979: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:25:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4284066152' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:25:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:54.623 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:55.056 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:25:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:55.057 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:25:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:25:55.057 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:25:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:55.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:56 compute-1 ceph-mon[80126]: pgmap v980: 353 pgs: 353 active+clean; 41 MiB data, 263 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:25:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:25:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:56.627 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:25:56 compute-1 nova_compute[225705]: 2026-01-23 10:25:56.781 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:25:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:25:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:25:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:25:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:25:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:57.170 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:57 compute-1 podman[236666]: 2026-01-23 10:25:57.675774478 +0000 UTC m=+0.078821824 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:25:57 compute-1 nova_compute[225705]: 2026-01-23 10:25:57.755 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:25:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:25:58.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:25:58 compute-1 ceph-mon[80126]: pgmap v981: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:25:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1766427025' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:25:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:25:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:25:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:25:59.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:25:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2805498511' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:26:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:00.634 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:00 compute-1 ceph-mon[80126]: pgmap v982: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:26:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:01.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:01 compute-1 nova_compute[225705]: 2026-01-23 10:26:01.784 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:01 compute-1 nova_compute[225705]: 2026-01-23 10:26:01.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:01 compute-1 nova_compute[225705]: 2026-01-23 10:26:01.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:26:01 compute-1 nova_compute[225705]: 2026-01-23 10:26:01.891 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:26:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:02 compute-1 ceph-mon[80126]: pgmap v983: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:26:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:02.636 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:02 compute-1 nova_compute[225705]: 2026-01-23 10:26:02.757 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:03.182 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:04 compute-1 ceph-mon[80126]: pgmap v984: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:26:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:04.640 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:05.186 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:05 compute-1 sudo[236690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:26:05 compute-1 sudo[236690]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:05 compute-1 sudo[236690]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:05 compute-1 nova_compute[225705]: 2026-01-23 10:26:05.891 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:05 compute-1 nova_compute[225705]: 2026-01-23 10:26:05.892 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:26:05 compute-1 nova_compute[225705]: 2026-01-23 10:26:05.892 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:26:06 compute-1 nova_compute[225705]: 2026-01-23 10:26:06.444 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:26:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:06.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:06 compute-1 nova_compute[225705]: 2026-01-23 10:26:06.785 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:07 compute-1 ceph-mon[80126]: pgmap v985: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 25 KiB/s rd, 1.8 MiB/s wr, 37 op/s
Jan 23 10:26:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:07.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:07 compute-1 nova_compute[225705]: 2026-01-23 10:26:07.766 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:08 compute-1 nova_compute[225705]: 2026-01-23 10:26:08.420 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:08 compute-1 ceph-mon[80126]: pgmap v986: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 1.8 MiB/s wr, 101 op/s
Jan 23 10:26:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:08.648 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:08 compute-1 nova_compute[225705]: 2026-01-23 10:26:08.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:08 compute-1 nova_compute[225705]: 2026-01-23 10:26:08.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:09.194 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1720359449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:09 compute-1 nova_compute[225705]: 2026-01-23 10:26:09.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:09 compute-1 nova_compute[225705]: 2026-01-23 10:26:09.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:09 compute-1 nova_compute[225705]: 2026-01-23 10:26:09.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:09 compute-1 nova_compute[225705]: 2026-01-23 10:26:09.897 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:09 compute-1 nova_compute[225705]: 2026-01-23 10:26:09.898 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:26:09 compute-1 nova_compute[225705]: 2026-01-23 10:26:09.898 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:26:10 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2269378824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.417 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.519s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:10 compute-1 ceph-mon[80126]: pgmap v987: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/80523321' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1268700137' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2269378824' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.626 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.628 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4894MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.628 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.628 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:10.650 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:10 compute-1 sudo[236740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:26:10 compute-1 sudo[236740]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:10 compute-1 sudo[236740]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.737 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.738 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:26:10 compute-1 nova_compute[225705]: 2026-01-23 10:26:10.757 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:10 compute-1 sudo[236765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:26:10 compute-1 sudo[236765]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:11.198 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:26:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/304457320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.270 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.278 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.305 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.329 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.330 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.701s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.331 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.331 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:26:11 compute-1 podman[236887]: 2026-01-23 10:26:11.454314705 +0000 UTC m=+0.075720728 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_REF=squid, org.label-schema.build-date=20250325, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
Jan 23 10:26:11 compute-1 podman[236887]: 2026-01-23 10:26:11.565171716 +0000 UTC m=+0.186577659 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, io.buildah.version=1.40.1, ceph=True, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_REF=squid, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 10:26:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/311252267' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/304457320' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:11 compute-1 nova_compute[225705]: 2026-01-23 10:26:11.788 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:12 compute-1 podman[237004]: 2026-01-23 10:26:12.082560202 +0000 UTC m=+0.068441532 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:26:12 compute-1 podman[237004]: 2026-01-23 10:26:12.100025655 +0000 UTC m=+0.085907005 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:26:12 compute-1 nova_compute[225705]: 2026-01-23 10:26:12.344 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:12 compute-1 nova_compute[225705]: 2026-01-23 10:26:12.345 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:12 compute-1 nova_compute[225705]: 2026-01-23 10:26:12.345 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:12 compute-1 nova_compute[225705]: 2026-01-23 10:26:12.345 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:26:12 compute-1 podman[237096]: 2026-01-23 10:26:12.533058275 +0000 UTC m=+0.071501296 container exec 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.build-date=20250325, OSD_FLAVOR=default, io.buildah.version=1.40.1, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:26:12 compute-1 podman[237096]: 2026-01-23 10:26:12.548767885 +0000 UTC m=+0.087210906 container exec_died 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1)
Jan 23 10:26:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:12.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:12 compute-1 nova_compute[225705]: 2026-01-23 10:26:12.769 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:12 compute-1 podman[237161]: 2026-01-23 10:26:12.839885657 +0000 UTC m=+0.068873586 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:26:12 compute-1 podman[237161]: 2026-01-23 10:26:12.848288709 +0000 UTC m=+0.077276628 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:26:12 compute-1 ceph-mon[80126]: pgmap v988: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:13 compute-1 podman[237198]: 2026-01-23 10:26:13.027803617 +0000 UTC m=+0.105068692 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:26:13 compute-1 podman[237250]: 2026-01-23 10:26:13.159360082 +0000 UTC m=+0.076483292 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, architecture=x86_64, io.buildah.version=1.28.2, description=keepalived for Ceph, build-date=2023-02-22T09:23:20, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, release=1793, io.openshift.expose-services=, com.redhat.component=keepalived-container, name=keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9)
Jan 23 10:26:13 compute-1 podman[237250]: 2026-01-23 10:26:13.177113354 +0000 UTC m=+0.094236594 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, io.openshift.expose-services=, build-date=2023-02-22T09:23:20, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, version=2.2.4, release=1793, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides keepalived on RHEL 9 for Ceph., description=keepalived for Ceph, com.redhat.component=keepalived-container, name=keepalived, io.openshift.tags=Ceph keepalived, io.k8s.display-name=Keepalived on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, io.buildah.version=1.28.2, vcs-type=git, vendor=Red Hat, Inc.)
Jan 23 10:26:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:13.201 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:13 compute-1 sudo[236765]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:13 compute-1 nova_compute[225705]: 2026-01-23 10:26:13.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.300 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.301 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.319 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.375 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.376 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.382 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.383 225709 INFO nova.compute.claims [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Claim successful on node compute-1.ctlplane.example.com
Jan 23 10:26:14 compute-1 ceph-mon[80126]: pgmap v989: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.514 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:14 compute-1 sudo[237284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:26:14 compute-1 sudo[237284]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:14 compute-1 sudo[237284]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:14 compute-1 sudo[237310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:26:14 compute-1 sudo[237310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:14.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:26:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:26:14 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1174453001' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:14 compute-1 nova_compute[225705]: 2026-01-23 10:26:14.997 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.483s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.009 225709 DEBUG nova.compute.provider_tree [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.057 225709 DEBUG nova.scheduler.client.report [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.083 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.084 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.126 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.126 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.151 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.172 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Jan 23 10:26:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:15.205 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:15 compute-1 sudo[237310]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.264 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.265 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.266 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Creating image(s)
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.298 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.333 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.370 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.374 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.466 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.468 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "379b2821245bc82aa5a95839eddb9a97716b559c" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.469 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.469 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "379b2821245bc82aa5a95839eddb9a97716b559c" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.507 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.513 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:15 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:15 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1174453001' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:26:15 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:26:15 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:26:15 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.805 225709 DEBUG nova.policy [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'f459c4e71e6c47acb0f8aaf83f34695e', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.808 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/379b2821245bc82aa5a95839eddb9a97716b559c 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.295s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:15 compute-1 nova_compute[225705]: 2026-01-23 10:26:15.882 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] resizing rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Jan 23 10:26:16 compute-1 nova_compute[225705]: 2026-01-23 10:26:16.000 225709 DEBUG nova.objects.instance [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'migration_context' on Instance uuid 7ca81dd2-d692-41ed-99b0-3046f49353ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:26:16 compute-1 nova_compute[225705]: 2026-01-23 10:26:16.018 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Jan 23 10:26:16 compute-1 nova_compute[225705]: 2026-01-23 10:26:16.019 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Ensure instance console log exists: /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Jan 23 10:26:16 compute-1 nova_compute[225705]: 2026-01-23 10:26:16.020 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:16 compute-1 nova_compute[225705]: 2026-01-23 10:26:16.021 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:16 compute-1 nova_compute[225705]: 2026-01-23 10:26:16.021 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:16 compute-1 ovn_controller[133293]: 2026-01-23T10:26:16Z|00086|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Jan 23 10:26:16 compute-1 ceph-mon[80126]: pgmap v990: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 64 op/s
Jan 23 10:26:16 compute-1 ceph-mon[80126]: pgmap v991: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 2.2 MiB/s rd, 75 op/s
Jan 23 10:26:16 compute-1 ceph-mon[80126]: pgmap v992: 353 pgs: 353 active+clean; 88 MiB data, 285 MiB used, 60 GiB / 60 GiB avail; 372 B/s rd, 0 op/s
Jan 23 10:26:16 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:16 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:26:16 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:26:16 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:26:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:16.660 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:16 compute-1 nova_compute[225705]: 2026-01-23 10:26:16.791 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:17.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:17 compute-1 nova_compute[225705]: 2026-01-23 10:26:17.551 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Successfully created port: a8ceb3e7-8c43-461e-b444-6492e841b540 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Jan 23 10:26:17 compute-1 nova_compute[225705]: 2026-01-23 10:26:17.772 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:18 compute-1 ceph-mon[80126]: pgmap v993: 353 pgs: 353 active+clean; 167 MiB data, 325 MiB used, 60 GiB / 60 GiB avail; 562 KiB/s rd, 5.7 MiB/s wr, 131 op/s
Jan 23 10:26:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:18.664 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:18 compute-1 nova_compute[225705]: 2026-01-23 10:26:18.943 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Successfully updated port: a8ceb3e7-8c43-461e-b444-6492e841b540 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Jan 23 10:26:18 compute-1 nova_compute[225705]: 2026-01-23 10:26:18.959 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:18 compute-1 nova_compute[225705]: 2026-01-23 10:26:18.960 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:18 compute-1 nova_compute[225705]: 2026-01-23 10:26:18.960 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Jan 23 10:26:19 compute-1 nova_compute[225705]: 2026-01-23 10:26:19.037 225709 DEBUG nova.compute.manager [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:19 compute-1 nova_compute[225705]: 2026-01-23 10:26:19.037 225709 DEBUG nova.compute.manager [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing instance network info cache due to event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:26:19 compute-1 nova_compute[225705]: 2026-01-23 10:26:19.038 225709 DEBUG oslo_concurrency.lockutils [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:19 compute-1 nova_compute[225705]: 2026-01-23 10:26:19.124 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Jan 23 10:26:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:19.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:20 compute-1 sudo[237558]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:26:20 compute-1 sudo[237558]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:20 compute-1 sudo[237558]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.257 225709 DEBUG nova.network.neutron [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.278 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.278 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance network_info: |[{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.278 225709 DEBUG oslo_concurrency.lockutils [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.279 225709 DEBUG nova.network.neutron [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.281 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start _get_guest_xml network_info=[{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'device_name': '/dev/vda', 'encryption_options': None, 'encrypted': False, 'guest_format': None, 'device_type': 'disk', 'size': 0, 'encryption_format': None, 'boot_index': 0, 'image_id': '271ec98e-d058-421b-bbfb-4b4a5954c90a'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.286 225709 WARNING nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.294 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.294 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.306 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Searching host: 'compute-1.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.306 225709 DEBUG nova.virt.libvirt.host [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.307 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.307 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-01-23T10:15:35Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='1d8c8bf4-786e-4009-bc53-f259480fb5b3',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-01-23T10:15:36Z,direct_url=<?>,disk_format='qcow2',id=271ec98e-d058-421b-bbfb-4b4a5954c90a,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='5220cd4f58cb43bb899e367e961bc5c1',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-01-23T10:15:40Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.307 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.308 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.309 225709 DEBUG nova.virt.hardware [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.312 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:20.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:26:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1766751452' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.814 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.847 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:26:20 compute-1 nova_compute[225705]: 2026-01-23 10:26:20.851 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:20 compute-1 ceph-mon[80126]: pgmap v994: 353 pgs: 353 active+clean; 167 MiB data, 325 MiB used, 60 GiB / 60 GiB avail; 562 KiB/s rd, 5.7 MiB/s wr, 131 op/s
Jan 23 10:26:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:26:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1766751452' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:26:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:21.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.217 225709 DEBUG nova.network.neutron [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updated VIF entry in instance network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.218 225709 DEBUG nova.network.neutron [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.237 225709 DEBUG oslo_concurrency.lockutils [req-eb2ad63c-b8d3-4042-8460-c09c4449af78 req-fc34149b-8cd4-4d80-a8b8-b737dff224ff 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Jan 23 10:26:21 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1512136197' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.303 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.305 225709 DEBUG nova.virt.libvirt.vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-693116214',display_name='tempest-TestNetworkBasicOps-server-693116214',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-693116214',id=12,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjlNRjQ7HampCGTP/XTbCk9R3Ib+fDj6CfNlH4m79pVD9aYufMedp8ud5j/BRBY25VTiRpd/PxmVnv+wizUD3d3aoKtzcmvEyogkbp0bOIKJAePE4aMxKhzUKychHG7bA==',key_name='tempest-TestNetworkBasicOps-2017035378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-nxpw4fzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:15Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=7ca81dd2-d692-41ed-99b0-3046f49353ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.306 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.307 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.309 225709 DEBUG nova.objects.instance [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7ca81dd2-d692-41ed-99b0-3046f49353ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.322 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] End _get_guest_xml xml=<domain type="kvm">
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <uuid>7ca81dd2-d692-41ed-99b0-3046f49353ac</uuid>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <name>instance-0000000c</name>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <memory>131072</memory>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <vcpu>1</vcpu>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <metadata>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <nova:name>tempest-TestNetworkBasicOps-server-693116214</nova:name>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <nova:creationTime>2026-01-23 10:26:20</nova:creationTime>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <nova:flavor name="m1.nano">
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:memory>128</nova:memory>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:disk>1</nova:disk>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:swap>0</nova:swap>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:ephemeral>0</nova:ephemeral>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:vcpus>1</nova:vcpus>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       </nova:flavor>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <nova:owner>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:user uuid="f459c4e71e6c47acb0f8aaf83f34695e">tempest-TestNetworkBasicOps-655467240-project-member</nova:user>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:project uuid="acc90003f0f7412b8daf8a1b6f0f1494">tempest-TestNetworkBasicOps-655467240</nova:project>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       </nova:owner>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <nova:root type="image" uuid="271ec98e-d058-421b-bbfb-4b4a5954c90a"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <nova:ports>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <nova:port uuid="a8ceb3e7-8c43-461e-b444-6492e841b540">
Jan 23 10:26:21 compute-1 nova_compute[225705]:           <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         </nova:port>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       </nova:ports>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </nova:instance>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   </metadata>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <sysinfo type="smbios">
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <system>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <entry name="manufacturer">RDO</entry>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <entry name="product">OpenStack Compute</entry>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <entry name="serial">7ca81dd2-d692-41ed-99b0-3046f49353ac</entry>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <entry name="uuid">7ca81dd2-d692-41ed-99b0-3046f49353ac</entry>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <entry name="family">Virtual Machine</entry>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </system>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   </sysinfo>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <os>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <type arch="x86_64" machine="q35">hvm</type>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <boot dev="hd"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <smbios mode="sysinfo"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   </os>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <features>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <acpi/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <apic/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <vmcoreinfo/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   </features>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <clock offset="utc">
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <timer name="pit" tickpolicy="delay"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <timer name="rtc" tickpolicy="catchup"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <timer name="hpet" present="no"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   </clock>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <cpu mode="host-model" match="exact">
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <topology sockets="1" cores="1" threads="1"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   </cpu>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   <devices>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <disk type="network" device="disk">
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/7ca81dd2-d692-41ed-99b0-3046f49353ac_disk">
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       </source>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <target dev="vda" bus="virtio"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <disk type="network" device="cdrom">
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <driver type="raw" cache="none"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <source protocol="rbd" name="vms/7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config">
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <host name="192.168.122.100" port="6789"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <host name="192.168.122.102" port="6789"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <host name="192.168.122.101" port="6789"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       </source>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <auth username="openstack">
Jan 23 10:26:21 compute-1 nova_compute[225705]:         <secret type="ceph" uuid="f3005f84-239a-55b6-a948-8f1fb592b920"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       </auth>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <target dev="sda" bus="sata"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </disk>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <interface type="ethernet">
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <mac address="fa:16:3e:69:ef:f5"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <driver name="vhost" rx_queue_size="512"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <mtu size="1442"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <target dev="tapa8ceb3e7-8c"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </interface>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <serial type="pty">
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <log file="/var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/console.log" append="off"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </serial>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <video>
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <model type="virtio"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </video>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <input type="tablet" bus="usb"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <rng model="virtio">
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <backend model="random">/dev/urandom</backend>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </rng>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="pci" model="pcie-root-port"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <controller type="usb" index="0"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     <memballoon model="virtio">
Jan 23 10:26:21 compute-1 nova_compute[225705]:       <stats period="10"/>
Jan 23 10:26:21 compute-1 nova_compute[225705]:     </memballoon>
Jan 23 10:26:21 compute-1 nova_compute[225705]:   </devices>
Jan 23 10:26:21 compute-1 nova_compute[225705]: </domain>
Jan 23 10:26:21 compute-1 nova_compute[225705]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.323 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Preparing to wait for external event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.324 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.325 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.325 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.326 225709 DEBUG nova.virt.libvirt.vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-01-23T10:26:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-693116214',display_name='tempest-TestNetworkBasicOps-server-693116214',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-693116214',id=12,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjlNRjQ7HampCGTP/XTbCk9R3Ib+fDj6CfNlH4m79pVD9aYufMedp8ud5j/BRBY25VTiRpd/PxmVnv+wizUD3d3aoKtzcmvEyogkbp0bOIKJAePE4aMxKhzUKychHG7bA==',key_name='tempest-TestNetworkBasicOps-2017035378',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-nxpw4fzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-01-23T10:26:15Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=7ca81dd2-d692-41ed-99b0-3046f49353ac,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.327 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.328 225709 DEBUG nova.network.os_vif_util [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.329 225709 DEBUG os_vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.330 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.330 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.331 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.335 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.336 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa8ceb3e7-8c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.337 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa8ceb3e7-8c, col_values=(('external_ids', {'iface-id': 'a8ceb3e7-8c43-461e-b444-6492e841b540', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:69:ef:f5', 'vm-uuid': '7ca81dd2-d692-41ed-99b0-3046f49353ac'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:21 compute-1 NetworkManager[48978]: <info>  [1769163981.3402] manager: (tapa8ceb3e7-8c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.342 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.346 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.347 225709 INFO os_vif [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c')
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.403 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.403 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.404 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] No VIF found with MAC fa:16:3e:69:ef:f5, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.405 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Using config drive
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.444 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.794 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.903 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Creating config drive at /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config
Jan 23 10:26:21 compute-1 nova_compute[225705]: 2026-01-23 10:26:21.913 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8j_z160s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:21 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1512136197' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:26:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.054 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp8j_z160s" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.099 225709 DEBUG nova.storage.rbd_utils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] rbd image 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.104 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.293 225709 DEBUG oslo_concurrency.processutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config 7ca81dd2-d692-41ed-99b0-3046f49353ac_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.189s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.295 225709 INFO nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deleting local config drive /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac/disk.config because it was imported into RBD.
Jan 23 10:26:22 compute-1 kernel: tapa8ceb3e7-8c: entered promiscuous mode
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.3699] manager: (tapa8ceb3e7-8c): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.370 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 ovn_controller[133293]: 2026-01-23T10:26:22Z|00087|binding|INFO|Claiming lport a8ceb3e7-8c43-461e-b444-6492e841b540 for this chassis.
Jan 23 10:26:22 compute-1 ovn_controller[133293]: 2026-01-23T10:26:22Z|00088|binding|INFO|a8ceb3e7-8c43-461e-b444-6492e841b540: Claiming fa:16:3e:69:ef:f5 10.100.0.9
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.3920] manager: (patch-provnet-995e8c2d-ca55-405c-bf26-97e408875e42-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/59)
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.391 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.3938] manager: (patch-br-int-to-provnet-995e8c2d-ca55-405c-bf26-97e408875e42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.395 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:ef:f5 10.100.0.9'], port_security=['fa:16:3e:69:ef:f5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7ca81dd2-d692-41ed-99b0-3046f49353ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0542887a-7598-408a-a342-24bd8aead651', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=a8ceb3e7-8c43-461e-b444-6492e841b540) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.398 143098 INFO neutron.agent.ovn.metadata.agent [-] Port a8ceb3e7-8c43-461e-b444-6492e841b540 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a bound to our chassis
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.400 143098 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 10:26:22 compute-1 systemd-machined[194551]: New machine qemu-5-instance-0000000c.
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.416 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d64b232a-6ace-498c-8dbb-ad4c9d559e90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.418 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap712c0ef6-f1 in ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.419 229898 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap712c0ef6-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.419 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[0eb36935-0b8c-40b3-a28f-a6eb5e5662b8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.420 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1fdf73fd-bf5a-4072-b2e2-895483ecf520]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.435 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[6161e87d-74b8-41d8-8299-5f60043ed618]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.464 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[e8939828-0b37-4fae-b92d-5910f26e94ba]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.492 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.500 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.502 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[52ac4734-6d95-4d0b-bba5-63e3105d7b53]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_controller[133293]: 2026-01-23T10:26:22Z|00089|binding|INFO|Setting lport a8ceb3e7-8c43-461e-b444-6492e841b540 up in Southbound
Jan 23 10:26:22 compute-1 ovn_controller[133293]: 2026-01-23T10:26:22Z|00090|binding|INFO|Setting lport a8ceb3e7-8c43-461e-b444-6492e841b540 ovn-installed in OVS
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.508 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[d45908ef-460c-4e0f-bca5-5c82eeb2c8ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.509 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.5100] manager: (tap712c0ef6-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/61)
Jan 23 10:26:22 compute-1 systemd-udevd[237724]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:26:22 compute-1 systemd-udevd[237726]: Network interface NamePolicy= disabled on kernel command line.
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.5377] device (tapa8ceb3e7-8c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.5384] device (tapa8ceb3e7-8c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.547 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[1ef83f9c-ad0f-4518-8939-4d75a9d0df28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.550 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[c0eda2cc-0df4-4327-883d-3c3e4aa0fd26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.5760] device (tap712c0ef6-f0): carrier: link connected
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.583 229913 DEBUG oslo.privsep.daemon [-] privsep: reply[4bba78c6-3c81-4bd3-84cc-d609dcc8a172]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.609 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3fdc4bb3-67d3-4c99-a321-11617e059807]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513655, 'reachable_time': 32308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 237751, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.631 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad098e4-33a4-4725-b9de-e4d1dc05aeed]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed0:ec06'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 513655, 'tstamp': 513655}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 237752, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.665 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[1c3258b7-4677-45a6-9438-a49cbc4e1f50]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap712c0ef6-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d0:ec:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 176, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513655, 'reachable_time': 32308, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 237753, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:22.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.696 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[dda21ece-7e99-48e0-9dc6-dec403cc344a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.756 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[39db4172-d722-4ade-bb7b-3f7ec12d293c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.757 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.757 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.757 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap712c0ef6-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:22 compute-1 kernel: tap712c0ef6-f0: entered promiscuous mode
Jan 23 10:26:22 compute-1 NetworkManager[48978]: <info>  [1769163982.7602] manager: (tap712c0ef6-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.759 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.766 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap712c0ef6-f0, col_values=(('external_ids', {'iface-id': '6c333384-cae4-4f40-8b56-257e8d961c46'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:22 compute-1 ovn_controller[133293]: 2026-01-23T10:26:22Z|00091|binding|INFO|Releasing lport 6c333384-cae4-4f40-8b56-257e8d961c46 from this chassis (sb_readonly=0)
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.768 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.780 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.780 143098 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.781 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[86395a1f-84c3-4815-8ee5-7fc17f9d6b0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.782 143098 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: global
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     log         /dev/log local0 debug
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     log-tag     haproxy-metadata-proxy-712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     user        root
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     group       root
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     maxconn     1024
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     pidfile     /var/lib/neutron/external/pids/712c0ef6-fbbe-4577-b44d-9610116b414a.pid.haproxy
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     daemon
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: defaults
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     log global
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     mode http
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     option httplog
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     option dontlognull
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     option http-server-close
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     option forwardfor
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     retries                 3
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     timeout http-request    30s
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     timeout connect         30s
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     timeout client          32s
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     timeout server          32s
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     timeout http-keep-alive 30s
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: listen listener
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     bind 169.254.169.254:80
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     server metadata /var/lib/neutron/metadata_proxy
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:     http-request add-header X-OVN-Network-ID 712c0ef6-fbbe-4577-b44d-9610116b414a
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Jan 23 10:26:22 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:22.782 143098 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'env', 'PROCESS_TAG=haproxy-712c0ef6-fbbe-4577-b44d-9610116b414a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/712c0ef6-fbbe-4577-b44d-9610116b414a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.852 225709 DEBUG nova.compute.manager [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.853 225709 DEBUG oslo_concurrency.lockutils [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.854 225709 DEBUG oslo_concurrency.lockutils [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.854 225709 DEBUG oslo_concurrency.lockutils [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:22 compute-1 nova_compute[225705]: 2026-01-23 10:26:22.855 225709 DEBUG nova.compute.manager [req-e3531ebd-2431-42ed-b8d5-68053c70d7ad req-94f96f4d-8352-48ca-a9b6-d60493db22c7 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Processing event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Jan 23 10:26:22 compute-1 ceph-mon[80126]: pgmap v995: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 567 KiB/s rd, 5.7 MiB/s wr, 135 op/s
Jan 23 10:26:23 compute-1 podman[237786]: 2026-01-23 10:26:23.201389894 +0000 UTC m=+0.078261727 container create 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:26:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:23.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:23 compute-1 systemd[1]: Started libpod-conmon-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297.scope.
Jan 23 10:26:23 compute-1 podman[237786]: 2026-01-23 10:26:23.164891308 +0000 UTC m=+0.041763211 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2
Jan 23 10:26:23 compute-1 systemd[1]: Started libcrun container.
Jan 23 10:26:23 compute-1 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7f38b4c9d64371d54aa104ef7a398b75b4fd9331f2fa9bb8717afb31f7a935f2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Jan 23 10:26:23 compute-1 podman[237786]: 2026-01-23 10:26:23.308331322 +0000 UTC m=+0.185203255 container init 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Jan 23 10:26:23 compute-1 podman[237786]: 2026-01-23 10:26:23.317994814 +0000 UTC m=+0.194866667 container start 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:26:23 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : New worker (237818) forked
Jan 23 10:26:23 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : Loading success.
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.519 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163983.5185375, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.519 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Started (Lifecycle Event)
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.522 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.526 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.530 225709 INFO nova.virt.libvirt.driver [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance spawned successfully.
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.531 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.542 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.547 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.555 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.556 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.557 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.557 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.558 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.558 225709 DEBUG nova.virt.libvirt.driver [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.568 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.569 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163983.5186481, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.569 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Paused (Lifecycle Event)
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.591 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.594 225709 DEBUG nova.virt.driver [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] Emitting event <LifecycleEvent: 1769163983.5250778, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.595 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Resumed (Lifecycle Event)
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.626 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.630 225709 DEBUG nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.634 225709 INFO nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 8.37 seconds to spawn the instance on the hypervisor.
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.635 225709 DEBUG nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.663 225709 INFO nova.compute.manager [None req-56b0df72-2634-413b-956f-b362cf1dabb4 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] During sync_power_state the instance has a pending task (spawning). Skip.
Jan 23 10:26:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.697 225709 INFO nova.compute.manager [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 9.34 seconds to build instance.
Jan 23 10:26:23 compute-1 nova_compute[225705]: 2026-01-23 10:26:23.733 225709 DEBUG oslo_concurrency.lockutils [None req-8d8a1703-ef9f-4095-9e5b-05da4bc94831 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.432s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:24 compute-1 ceph-mon[80126]: pgmap v996: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 567 KiB/s rd, 5.7 MiB/s wr, 135 op/s
Jan 23 10:26:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:24.672 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:24 compute-1 nova_compute[225705]: 2026-01-23 10:26:24.936 225709 DEBUG nova.compute.manager [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:24 compute-1 nova_compute[225705]: 2026-01-23 10:26:24.936 225709 DEBUG oslo_concurrency.lockutils [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:24 compute-1 nova_compute[225705]: 2026-01-23 10:26:24.937 225709 DEBUG oslo_concurrency.lockutils [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:24 compute-1 nova_compute[225705]: 2026-01-23 10:26:24.937 225709 DEBUG oslo_concurrency.lockutils [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:24 compute-1 nova_compute[225705]: 2026-01-23 10:26:24.938 225709 DEBUG nova.compute.manager [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] No waiting events found dispatching network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:24 compute-1 nova_compute[225705]: 2026-01-23 10:26:24.938 225709 WARNING nova.compute.manager [req-aecda461-dfa3-4c62-beee-12d6813c18e0 req-e79db28b-d0d4-4944-a8bd-1e5abffd98d6 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received unexpected event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 for instance with vm_state active and task_state None.
Jan 23 10:26:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:25.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:25 compute-1 sudo[237858]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:26:25 compute-1 sudo[237858]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:25 compute-1 sudo[237858]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:26 compute-1 ceph-mon[80126]: pgmap v997: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 467 KiB/s rd, 4.7 MiB/s wr, 112 op/s
Jan 23 10:26:26 compute-1 nova_compute[225705]: 2026-01-23 10:26:26.342 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:26.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:26 compute-1 nova_compute[225705]: 2026-01-23 10:26:26.796 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:27.222 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:28 compute-1 ceph-mon[80126]: pgmap v998: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 2.3 MiB/s rd, 3.9 MiB/s wr, 167 op/s
Jan 23 10:26:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:28.698 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:28 compute-1 podman[237884]: 2026-01-23 10:26:28.753806446 +0000 UTC m=+0.144326593 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:26:28 compute-1 nova_compute[225705]: 2026-01-23 10:26:28.910 225709 DEBUG nova.compute.manager [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:28 compute-1 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG nova.compute.manager [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing instance network info cache due to event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:26:28 compute-1 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG oslo_concurrency.lockutils [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:28 compute-1 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG oslo_concurrency.lockutils [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:28 compute-1 nova_compute[225705]: 2026-01-23 10:26:28.911 225709 DEBUG nova.network.neutron [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:26:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:29.225 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:30 compute-1 nova_compute[225705]: 2026-01-23 10:26:30.004 225709 DEBUG nova.network.neutron [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updated VIF entry in instance network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:26:30 compute-1 nova_compute[225705]: 2026-01-23 10:26:30.005 225709 DEBUG nova.network.neutron [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:30 compute-1 nova_compute[225705]: 2026-01-23 10:26:30.272 225709 DEBUG oslo_concurrency.lockutils [req-3cdeb35c-848d-4f27-912e-0e6f08d2b63d req-236fa041-050e-48b4-9361-811da9b52c48 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:30.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:30 compute-1 ceph-mon[80126]: pgmap v999: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 77 op/s
Jan 23 10:26:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:31.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:31 compute-1 nova_compute[225705]: 2026-01-23 10:26:31.346 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:31 compute-1 nova_compute[225705]: 2026-01-23 10:26:31.800 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:32 compute-1 ceph-mon[80126]: pgmap v1000: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 25 KiB/s wr, 77 op/s
Jan 23 10:26:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:32.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:33.232 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:34 compute-1 ceph-mon[80126]: pgmap v1001: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:34.708 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:35.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:36 compute-1 nova_compute[225705]: 2026-01-23 10:26:36.350 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:36.711 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:36 compute-1 nova_compute[225705]: 2026-01-23 10:26:36.802 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:36 compute-1 ceph-mon[80126]: pgmap v1002: 353 pgs: 353 active+clean; 167 MiB data, 335 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:26:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:37 compute-1 ovn_controller[133293]: 2026-01-23T10:26:37Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:69:ef:f5 10.100.0.9
Jan 23 10:26:37 compute-1 ovn_controller[133293]: 2026-01-23T10:26:37Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:69:ef:f5 10.100.0.9
Jan 23 10:26:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:37.240 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:38.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:38 compute-1 ceph-mon[80126]: pgmap v1003: 353 pgs: 353 active+clean; 188 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 2.1 MiB/s rd, 2.0 MiB/s wr, 117 op/s
Jan 23 10:26:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:39.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:40 compute-1 ceph-mon[80126]: pgmap v1004: 353 pgs: 353 active+clean; 188 MiB data, 348 MiB used, 60 GiB / 60 GiB avail; 237 KiB/s rd, 2.0 MiB/s wr, 43 op/s
Jan 23 10:26:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:40.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:41.247 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:41 compute-1 nova_compute[225705]: 2026-01-23 10:26:41.353 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:41 compute-1 nova_compute[225705]: 2026-01-23 10:26:41.806 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:42 compute-1 ceph-mon[80126]: pgmap v1005: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:26:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:42.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:43.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:43 compute-1 podman[237912]: 2026-01-23 10:26:43.703740198 +0000 UTC m=+0.104784302 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 10:26:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:44 compute-1 ceph-mon[80126]: pgmap v1006: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:26:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:44.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:45.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:45 compute-1 sudo[237939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:26:46 compute-1 nova_compute[225705]: 2026-01-23 10:26:46.356 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:46 compute-1 sudo[237939]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:26:46 compute-1 sudo[237939]: pam_unix(sudo:session): session closed for user root
Jan 23 10:26:46 compute-1 ceph-mon[80126]: pgmap v1007: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 391 KiB/s rd, 2.1 MiB/s wr, 65 op/s
Jan 23 10:26:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:46.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:46 compute-1 nova_compute[225705]: 2026-01-23 10:26:46.843 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:47 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:46.999 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:26:47 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:47.001 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:26:47 compute-1 nova_compute[225705]: 2026-01-23 10:26:47.001 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:47.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:48 compute-1 ceph-mon[80126]: pgmap v1008: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 392 KiB/s rd, 2.1 MiB/s wr, 66 op/s
Jan 23 10:26:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:48.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:49.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2133716442' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:26:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2133716442' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:26:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:50.733 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:51 compute-1 ceph-mon[80126]: pgmap v1009: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 155 KiB/s rd, 107 KiB/s wr, 23 op/s
Jan 23 10:26:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:26:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:26:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:51.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:26:51 compute-1 nova_compute[225705]: 2026-01-23 10:26:51.360 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:51 compute-1 nova_compute[225705]: 2026-01-23 10:26:51.847 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:52 compute-1 ceph-mon[80126]: pgmap v1010: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 162 KiB/s rd, 112 KiB/s wr, 24 op/s
Jan 23 10:26:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:52.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:53.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.382 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.382 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.383 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.383 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.383 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.384 225709 INFO nova.compute.manager [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Terminating instance
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.385 225709 DEBUG nova.compute.manager [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Jan 23 10:26:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.874 225709 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.875 225709 DEBUG nova.compute.manager [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing instance network info cache due to event network-changed-a8ceb3e7-8c43-461e-b444-6492e841b540. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.875 225709 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.875 225709 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquired lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Jan 23 10:26:53 compute-1 nova_compute[225705]: 2026-01-23 10:26:53.876 225709 DEBUG nova.network.neutron [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Refreshing network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Jan 23 10:26:54 compute-1 kernel: tapa8ceb3e7-8c (unregistering): left promiscuous mode
Jan 23 10:26:54 compute-1 NetworkManager[48978]: <info>  [1769164014.4634] device (tapa8ceb3e7-8c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Jan 23 10:26:54 compute-1 ovn_controller[133293]: 2026-01-23T10:26:54Z|00092|binding|INFO|Releasing lport a8ceb3e7-8c43-461e-b444-6492e841b540 from this chassis (sb_readonly=0)
Jan 23 10:26:54 compute-1 ovn_controller[133293]: 2026-01-23T10:26:54Z|00093|binding|INFO|Setting lport a8ceb3e7-8c43-461e-b444-6492e841b540 down in Southbound
Jan 23 10:26:54 compute-1 ovn_controller[133293]: 2026-01-23T10:26:54Z|00094|binding|INFO|Removing iface tapa8ceb3e7-8c ovn-installed in OVS
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.473 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.476 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.502 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:69:ef:f5 10.100.0.9'], port_security=['fa:16:3e:69:ef:f5 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-1.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '7ca81dd2-d692-41ed-99b0-3046f49353ac', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-712c0ef6-fbbe-4577-b44d-9610116b414a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'acc90003f0f7412b8daf8a1b6f0f1494', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0542887a-7598-408a-a342-24bd8aead651', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-1.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3547f5ca-ca7c-4ba0-a5f8-3ad2055eb8ec, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>], logical_port=a8ceb3e7-8c43-461e-b444-6492e841b540) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f31a2f66640>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.503 143098 INFO neutron.agent.ovn.metadata.agent [-] Port a8ceb3e7-8c43-461e-b444-6492e841b540 in datapath 712c0ef6-fbbe-4577-b44d-9610116b414a unbound from our chassis
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.504 143098 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 712c0ef6-fbbe-4577-b44d-9610116b414a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.507 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[587f6daf-c1a1-4f38-bc25-f1ecf54530f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.508 143098 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a namespace which is not needed anymore
Jan 23 10:26:54 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Jan 23 10:26:54 compute-1 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 14.819s CPU time.
Jan 23 10:26:54 compute-1 systemd-machined[194551]: Machine qemu-5-instance-0000000c terminated.
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.620 225709 INFO nova.virt.libvirt.driver [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Instance destroyed successfully.
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.621 225709 DEBUG nova.objects.instance [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lazy-loading 'resources' on Instance uuid 7ca81dd2-d692-41ed-99b0-3046f49353ac obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.649 225709 DEBUG nova.virt.libvirt.vif [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-01-23T10:26:12Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-693116214',display_name='tempest-TestNetworkBasicOps-server-693116214',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-693116214',id=12,image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJjlNRjQ7HampCGTP/XTbCk9R3Ib+fDj6CfNlH4m79pVD9aYufMedp8ud5j/BRBY25VTiRpd/PxmVnv+wizUD3d3aoKtzcmvEyogkbp0bOIKJAePE4aMxKhzUKychHG7bA==',key_name='tempest-TestNetworkBasicOps-2017035378',keypairs=<?>,launch_index=0,launched_at=2026-01-23T10:26:23Z,launched_on='compute-1.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-1.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='acc90003f0f7412b8daf8a1b6f0f1494',ramdisk_id='',reservation_id='r-nxpw4fzd',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='271ec98e-d058-421b-bbfb-4b4a5954c90a',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-655467240',owner_user_name='tempest-TestNetworkBasicOps-655467240-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-01-23T10:26:23Z,user_data=None,user_id='f459c4e71e6c47acb0f8aaf83f34695e',uuid=7ca81dd2-d692-41ed-99b0-3046f49353ac,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.649 225709 DEBUG nova.network.os_vif_util [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converting VIF {"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.194", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.650 225709 DEBUG nova.network.os_vif_util [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.650 225709 DEBUG os_vif [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.652 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 23 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.652 225709 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa8ceb3e7-8c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.653 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.655 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : haproxy version is 2.8.14-c23fe91
Jan 23 10:26:54 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [NOTICE]   (237803) : path to executable is /usr/sbin/haproxy
Jan 23 10:26:54 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [WARNING]  (237803) : Exiting Master process...
Jan 23 10:26:54 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [WARNING]  (237803) : Exiting Master process...
Jan 23 10:26:54 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [ALERT]    (237803) : Current worker (237818) exited with code 143 (Terminated)
Jan 23 10:26:54 compute-1 neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a[237799]: [WARNING]  (237803) : All workers exited. Exiting... (0)
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.658 225709 INFO os_vif [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:69:ef:f5,bridge_name='br-int',has_traffic_filtering=True,id=a8ceb3e7-8c43-461e-b444-6492e841b540,network=Network(712c0ef6-fbbe-4577-b44d-9610116b414a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa8ceb3e7-8c')
Jan 23 10:26:54 compute-1 systemd[1]: libpod-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297.scope: Deactivated successfully.
Jan 23 10:26:54 compute-1 podman[237993]: 2026-01-23 10:26:54.667198122 +0000 UTC m=+0.057348865 container died 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:26:54 compute-1 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297-userdata-shm.mount: Deactivated successfully.
Jan 23 10:26:54 compute-1 systemd[1]: var-lib-containers-storage-overlay-7f38b4c9d64371d54aa104ef7a398b75b4fd9331f2fa9bb8717afb31f7a935f2-merged.mount: Deactivated successfully.
Jan 23 10:26:54 compute-1 podman[237993]: 2026-01-23 10:26:54.70952779 +0000 UTC m=+0.099678533 container cleanup 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:26:54 compute-1 systemd[1]: libpod-conmon-0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297.scope: Deactivated successfully.
Jan 23 10:26:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:54.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:54 compute-1 podman[238052]: 2026-01-23 10:26:54.779371594 +0000 UTC m=+0.047609853 container remove 0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.784 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[c46d2b15-b1d9-49b1-8e2a-b8faafebf4d6]: (4, ('Fri Jan 23 10:26:54 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297)\n0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297\nFri Jan 23 10:26:54 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a (0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297)\n0e33070e4a72783d414daa071ba5be2971b2af2838309cb967cdef419e712297\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.787 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[6d01806c-f6f7-40de-b21c-192177a8de8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.788 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap712c0ef6-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.790 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 ceph-mon[80126]: pgmap v1011: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 17 KiB/s wr, 2 op/s
Jan 23 10:26:54 compute-1 kernel: tap712c0ef6-f0: left promiscuous mode
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.804 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 nova_compute[225705]: 2026-01-23 10:26:54.806 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.809 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[2af8715b-cd4f-430a-91a8-0e9436dc3580]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.826 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb88e39-8077-4468-9c82-300e59d8b804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.827 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[28ee66bd-82c7-4535-91b1-49a65d828838]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.842 229898 DEBUG oslo.privsep.daemon [-] privsep: reply[b3d08f0f-464f-4d92-87e8-521f93915fd4]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 513647, 'reachable_time': 39515, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 238067, 'error': None, 'target': 'ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:54 compute-1 systemd[1]: run-netns-ovnmeta\x2d712c0ef6\x2dfbbe\x2d4577\x2db44d\x2d9610116b414a.mount: Deactivated successfully.
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.849 143216 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-712c0ef6-fbbe-4577-b44d-9610116b414a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Jan 23 10:26:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:54.850 143216 DEBUG oslo.privsep.daemon [-] privsep: reply[47bad867-d72a-4c77-a8e5-76be91908433]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Jan 23 10:26:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:55.057 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:55 compute-1 nova_compute[225705]: 2026-01-23 10:26:55.112 225709 DEBUG nova.network.neutron [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updated VIF entry in instance network info cache for port a8ceb3e7-8c43-461e-b444-6492e841b540. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Jan 23 10:26:55 compute-1 nova_compute[225705]: 2026-01-23 10:26:55.113 225709 DEBUG nova.network.neutron [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [{"id": "a8ceb3e7-8c43-461e-b444-6492e841b540", "address": "fa:16:3e:69:ef:f5", "network": {"id": "712c0ef6-fbbe-4577-b44d-9610116b414a", "bridge": "br-int", "label": "tempest-network-smoke--493794687", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "acc90003f0f7412b8daf8a1b6f0f1494", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa8ceb3e7-8c", "ovs_interfaceid": "a8ceb3e7-8c43-461e-b444-6492e841b540", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:26:55 compute-1 nova_compute[225705]: 2026-01-23 10:26:55.163 225709 DEBUG oslo_concurrency.lockutils [req-d246c1a9-c9a3-4401-8a94-0bc35db2b2cd req-fb4dd47a-5f72-483c-9e2e-a15e865cff8a 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Releasing lock "refresh_cache-7ca81dd2-d692-41ed-99b0-3046f49353ac" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Jan 23 10:26:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:55.268 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:56 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:26:56.003 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.295 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-unplugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.296 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.296 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] No waiting events found dispatching network-vif-unplugged-a8ceb3e7-8c43-461e-b444-6492e841b540 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-unplugged-a8ceb3e7-8c43-461e-b444-6492e841b540 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.297 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Acquiring lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG oslo_concurrency.lockutils [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.298 225709 DEBUG nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] No waiting events found dispatching network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.299 225709 WARNING nova.compute.manager [req-583b26fe-b92b-4768-8a50-0969c12d807c req-8bd27e76-3f40-40a1-aaa1-bd4bd6824805 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received unexpected event network-vif-plugged-a8ceb3e7-8c43-461e-b444-6492e841b540 for instance with vm_state active and task_state deleting.
Jan 23 10:26:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:26:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:56.745 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:26:56 compute-1 nova_compute[225705]: 2026-01-23 10:26:56.850 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:26:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:26:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:26:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:26:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:26:57 compute-1 ceph-mon[80126]: pgmap v1012: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 7.2 KiB/s rd, 17 KiB/s wr, 2 op/s
Jan 23 10:26:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:57.271 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:58 compute-1 ceph-mon[80126]: pgmap v1013: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 13 KiB/s rd, 19 KiB/s wr, 8 op/s
Jan 23 10:26:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:26:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:26:58.750 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:26:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:26:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:26:59.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:26:59 compute-1 nova_compute[225705]: 2026-01-23 10:26:59.287 225709 INFO nova.virt.libvirt.driver [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deleting instance files /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac_del
Jan 23 10:26:59 compute-1 nova_compute[225705]: 2026-01-23 10:26:59.288 225709 INFO nova.virt.libvirt.driver [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deletion of /var/lib/nova/instances/7ca81dd2-d692-41ed-99b0-3046f49353ac_del complete
Jan 23 10:26:59 compute-1 nova_compute[225705]: 2026-01-23 10:26:59.510 225709 INFO nova.compute.manager [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 6.12 seconds to destroy the instance on the hypervisor.
Jan 23 10:26:59 compute-1 nova_compute[225705]: 2026-01-23 10:26:59.511 225709 DEBUG oslo.service.loopingcall [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Jan 23 10:26:59 compute-1 nova_compute[225705]: 2026-01-23 10:26:59.511 225709 DEBUG nova.compute.manager [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Jan 23 10:26:59 compute-1 nova_compute[225705]: 2026-01-23 10:26:59.511 225709 DEBUG nova.network.neutron [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Jan 23 10:26:59 compute-1 nova_compute[225705]: 2026-01-23 10:26:59.653 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:26:59 compute-1 podman[238072]: 2026-01-23 10:26:59.69222588 +0000 UTC m=+0.079880538 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Jan 23 10:27:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:00.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.061 225709 DEBUG nova.network.neutron [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:27:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:01.279 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.301 225709 DEBUG nova.compute.manager [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Received event network-vif-deleted-a8ceb3e7-8c43-461e-b444-6492e841b540 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.302 225709 INFO nova.compute.manager [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Neutron deleted interface a8ceb3e7-8c43-461e-b444-6492e841b540; detaching it from the instance and deleting it from the info cache
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.302 225709 DEBUG nova.network.neutron [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Jan 23 10:27:01 compute-1 ceph-mon[80126]: pgmap v1014: 353 pgs: 353 active+clean; 200 MiB data, 361 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 6.7 KiB/s wr, 8 op/s
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.335 225709 INFO nova.compute.manager [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Took 1.82 seconds to deallocate network for instance.
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.342 225709 DEBUG nova.compute.manager [req-dffdcd5a-b4c7-4fac-8cb2-0ee0db6e78d3 req-6e08f62a-9c1a-4135-b60a-73df7ab4b14e 56a5c2dc076c4e2489c82e9feac864fb 3b334319b2184689ac0dd92f207d57b0 - - default default] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Detach interface failed, port_id=a8ceb3e7-8c43-461e-b444-6492e841b540, reason: Instance 7ca81dd2-d692-41ed-99b0-3046f49353ac could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.471 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.471 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.596 225709 DEBUG oslo_concurrency.processutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:27:01 compute-1 nova_compute[225705]: 2026-01-23 10:27:01.852 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:27:02 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/138703590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:02 compute-1 nova_compute[225705]: 2026-01-23 10:27:02.075 225709 DEBUG oslo_concurrency.processutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.479s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:27:02 compute-1 nova_compute[225705]: 2026-01-23 10:27:02.082 225709 DEBUG nova.compute.provider_tree [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:27:02 compute-1 nova_compute[225705]: 2026-01-23 10:27:02.576 225709 DEBUG nova.scheduler.client.report [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:27:02 compute-1 nova_compute[225705]: 2026-01-23 10:27:02.740 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:02.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:02 compute-1 sshd-session[238115]: Invalid user sol from 45.148.10.240 port 42042
Jan 23 10:27:02 compute-1 nova_compute[225705]: 2026-01-23 10:27:02.879 225709 INFO nova.scheduler.client.report [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Deleted allocations for instance 7ca81dd2-d692-41ed-99b0-3046f49353ac
Jan 23 10:27:02 compute-1 sshd-session[238115]: Connection closed by invalid user sol 45.148.10.240 port 42042 [preauth]
Jan 23 10:27:03 compute-1 ceph-mon[80126]: pgmap v1015: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 26 KiB/s rd, 17 KiB/s wr, 31 op/s
Jan 23 10:27:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/138703590' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:03.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:04 compute-1 nova_compute[225705]: 2026-01-23 10:27:04.263 225709 DEBUG oslo_concurrency.lockutils [None req-8c61ffd0-6ac7-4575-a326-f99fb856d837 f459c4e71e6c47acb0f8aaf83f34695e acc90003f0f7412b8daf8a1b6f0f1494 - - default default] Lock "7ca81dd2-d692-41ed-99b0-3046f49353ac" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 10.881s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:04 compute-1 nova_compute[225705]: 2026-01-23 10:27:04.656 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:27:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:04.758 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:27:04 compute-1 ceph-mon[80126]: pgmap v1016: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 29 op/s
Jan 23 10:27:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:05.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:06 compute-1 sudo[238119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:27:06 compute-1 sudo[238119]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:06 compute-1 sudo[238119]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:06.762 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:06 compute-1 nova_compute[225705]: 2026-01-23 10:27:06.856 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:06 compute-1 nova_compute[225705]: 2026-01-23 10:27:06.897 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:06 compute-1 nova_compute[225705]: 2026-01-23 10:27:06.897 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:27:06 compute-1 nova_compute[225705]: 2026-01-23 10:27:06.898 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:27:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:27:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:07.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:27:07 compute-1 nova_compute[225705]: 2026-01-23 10:27:07.446 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:27:08 compute-1 ceph-mon[80126]: pgmap v1017: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 13 KiB/s wr, 29 op/s
Jan 23 10:27:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:27:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:08.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:27:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:09.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:09 compute-1 nova_compute[225705]: 2026-01-23 10:27:09.419 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:09 compute-1 nova_compute[225705]: 2026-01-23 10:27:09.618 225709 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769164014.6176653, 7ca81dd2-d692-41ed-99b0-3046f49353ac => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Jan 23 10:27:09 compute-1 nova_compute[225705]: 2026-01-23 10:27:09.619 225709 INFO nova.compute.manager [-] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] VM Stopped (Lifecycle Event)
Jan 23 10:27:09 compute-1 nova_compute[225705]: 2026-01-23 10:27:09.658 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:09 compute-1 nova_compute[225705]: 2026-01-23 10:27:09.986 225709 DEBUG nova.compute.manager [None req-0c708697-ab87-444a-869f-e1d106eb6707 - - - - - -] [instance: 7ca81dd2-d692-41ed-99b0-3046f49353ac] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Jan 23 10:27:10 compute-1 ceph-mon[80126]: pgmap v1018: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 13 KiB/s wr, 30 op/s
Jan 23 10:27:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:10.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:10 compute-1 nova_compute[225705]: 2026-01-23 10:27:10.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:10 compute-1 nova_compute[225705]: 2026-01-23 10:27:10.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:11.300 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:11 compute-1 nova_compute[225705]: 2026-01-23 10:27:11.859 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:11 compute-1 ceph-mon[80126]: pgmap v1019: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 14 KiB/s rd, 11 KiB/s wr, 23 op/s
Jan 23 10:27:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/44430795' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:11 compute-1 nova_compute[225705]: 2026-01-23 10:27:11.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:11 compute-1 nova_compute[225705]: 2026-01-23 10:27:11.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:11 compute-1 nova_compute[225705]: 2026-01-23 10:27:11.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:11 compute-1 nova_compute[225705]: 2026-01-23 10:27:11.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:11 compute-1 nova_compute[225705]: 2026-01-23 10:27:11.902 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:27:11 compute-1 nova_compute[225705]: 2026-01-23 10:27:11.903 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:27:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:27:12 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1204745815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.436 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.533s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.614 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.615 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4885MB free_disk=59.94270324707031GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.615 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.615 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.684 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.685 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:27:12 compute-1 nova_compute[225705]: 2026-01-23 10:27:12.705 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:27:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:12.772 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:13 compute-1 ceph-mon[80126]: pgmap v1020: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 15 KiB/s rd, 11 KiB/s wr, 24 op/s
Jan 23 10:27:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3962373468' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:27:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1089974770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:13 compute-1 nova_compute[225705]: 2026-01-23 10:27:13.167 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:27:13 compute-1 nova_compute[225705]: 2026-01-23 10:27:13.173 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:27:13 compute-1 nova_compute[225705]: 2026-01-23 10:27:13.189 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:27:13 compute-1 nova_compute[225705]: 2026-01-23 10:27:13.212 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:27:13 compute-1 nova_compute[225705]: 2026-01-23 10:27:13.213 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.598s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:13.303 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1204745815' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1089974770' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/506360951' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:14 compute-1 nova_compute[225705]: 2026-01-23 10:27:14.213 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:14 compute-1 nova_compute[225705]: 2026-01-23 10:27:14.214 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:14 compute-1 nova_compute[225705]: 2026-01-23 10:27:14.214 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:14 compute-1 nova_compute[225705]: 2026-01-23 10:27:14.215 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:27:14 compute-1 nova_compute[225705]: 2026-01-23 10:27:14.695 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:14.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:14 compute-1 podman[238193]: 2026-01-23 10:27:14.778446534 +0000 UTC m=+0.163231422 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Jan 23 10:27:15 compute-1 ceph-mon[80126]: pgmap v1021: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:27:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3948964576' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:15.306 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:15 compute-1 nova_compute[225705]: 2026-01-23 10:27:15.870 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:15 compute-1 nova_compute[225705]: 2026-01-23 10:27:15.888 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:27:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:16.779 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:16 compute-1 nova_compute[225705]: 2026-01-23 10:27:16.861 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:27:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:17.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:27:17 compute-1 ceph-mon[80126]: pgmap v1022: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 682 B/s rd, 0 B/s wr, 0 op/s
Jan 23 10:27:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:18.782 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:18 compute-1 ceph-mon[80126]: pgmap v1023: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 852 B/s wr, 26 op/s
Jan 23 10:27:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:19.314 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:19 compute-1 nova_compute[225705]: 2026-01-23 10:27:19.698 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:20 compute-1 sudo[238222]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:27:20 compute-1 sudo[238222]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:20 compute-1 sudo[238222]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:20 compute-1 sudo[238247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:27:20 compute-1 sudo[238247]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:20.785 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:20 compute-1 sudo[238247]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:21 compute-1 ceph-mon[80126]: pgmap v1024: 353 pgs: 353 active+clean; 41 MiB data, 268 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 852 B/s wr, 25 op/s
Jan 23 10:27:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:21 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2669587480' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:21.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:21 compute-1 nova_compute[225705]: 2026-01-23 10:27:21.862 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:22.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:22 compute-1 ceph-mon[80126]: pgmap v1025: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:27:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:23.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:24 compute-1 ceph-mon[80126]: pgmap v1026: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:27:24 compute-1 nova_compute[225705]: 2026-01-23 10:27:24.701 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:24.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:25.324 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:25 compute-1 nova_compute[225705]: 2026-01-23 10:27:25.681 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:25 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:25 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:25 compute-1 nova_compute[225705]: 2026-01-23 10:27:25.789 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:26 compute-1 sudo[238307]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:27:26 compute-1 sudo[238307]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:26 compute-1 sudo[238307]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:26 compute-1 ceph-mon[80126]: pgmap v1027: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:27:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:27:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:27:26 compute-1 ceph-mon[80126]: pgmap v1028: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 22 KiB/s rd, 1.3 KiB/s wr, 31 op/s
Jan 23 10:27:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:26 compute-1 ceph-mon[80126]: Health check update: 2 failed cephadm daemon(s) (CEPHADM_FAILED_DAEMON)
Jan 23 10:27:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:27:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:27:26 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:27:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:26.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:26 compute-1 nova_compute[225705]: 2026-01-23 10:27:26.901 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:27.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:28.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:29.332 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:29 compute-1 ceph-mon[80126]: pgmap v1029: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 383 B/s wr, 3 op/s
Jan 23 10:27:29 compute-1 nova_compute[225705]: 2026-01-23 10:27:29.703 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:30 compute-1 ceph-mon[80126]: pgmap v1030: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 2.0 KiB/s rd, 383 B/s wr, 3 op/s
Jan 23 10:27:30 compute-1 podman[238334]: 2026-01-23 10:27:30.677928094 +0000 UTC m=+0.069328577 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:27:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:30.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:31.335 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:31 compute-1 sudo[238356]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:27:31 compute-1 sudo[238356]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:31 compute-1 sudo[238356]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:31 compute-1 nova_compute[225705]: 2026-01-23 10:27:31.905 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:32 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:32 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:27:32 compute-1 ceph-mon[80126]: pgmap v1031: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 574 B/s rd, 0 op/s
Jan 23 10:27:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:32.803 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:33.339 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:34 compute-1 nova_compute[225705]: 2026-01-23 10:27:34.705 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:34.807 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:35 compute-1 ceph-mon[80126]: pgmap v1032: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 574 B/s rd, 0 op/s
Jan 23 10:27:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:35.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:36.809 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:36 compute-1 nova_compute[225705]: 2026-01-23 10:27:36.938 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:37 compute-1 ceph-mon[80126]: pgmap v1033: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 574 B/s rd, 0 op/s
Jan 23 10:27:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:38 compute-1 ceph-mon[80126]: pgmap v1034: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:38.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:39.351 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:39 compute-1 nova_compute[225705]: 2026-01-23 10:27:39.707 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:27:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:40.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:27:40 compute-1 ceph-mon[80126]: pgmap v1035: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:41.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:41 compute-1 nova_compute[225705]: 2026-01-23 10:27:41.943 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:42.820 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:43.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:43 compute-1 ceph-mon[80126]: pgmap v1036: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:44 compute-1 nova_compute[225705]: 2026-01-23 10:27:44.710 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:44.824 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:45.363 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:45 compute-1 podman[238388]: 2026-01-23 10:27:45.724304048 +0000 UTC m=+0.114750401 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:27:46 compute-1 ceph-mon[80126]: pgmap v1037: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:46 compute-1 sudo[238415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:27:46 compute-1 sudo[238415]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:27:46 compute-1 sudo[238415]: pam_unix(sudo:session): session closed for user root
Jan 23 10:27:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:46.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:46 compute-1 nova_compute[225705]: 2026-01-23 10:27:46.946 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:47 compute-1 ceph-mon[80126]: pgmap v1038: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:47.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:47 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:27:47.570 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:27:47 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:27:47.571 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:27:47 compute-1 nova_compute[225705]: 2026-01-23 10:27:47.572 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:48.830 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:49 compute-1 ceph-mon[80126]: pgmap v1039: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2175178822' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:27:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2175178822' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:27:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:49.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:49 compute-1 nova_compute[225705]: 2026-01-23 10:27:49.712 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:50.833 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:27:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:51.374 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:51 compute-1 nova_compute[225705]: 2026-01-23 10:27:51.948 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:52 compute-1 ceph-mon[80126]: pgmap v1040: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:52 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/855952180' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:27:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:52.837 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:53.377 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:53 compute-1 ceph-mon[80126]: pgmap v1041: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:27:54 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:27:54.572 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:27:54 compute-1 nova_compute[225705]: 2026-01-23 10:27:54.716 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:54.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:27:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:27:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:27:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:27:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:27:55.058 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:27:55 compute-1 ceph-mon[80126]: pgmap v1042: 353 pgs: 353 active+clean; 41 MiB data, 267 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:27:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:55.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:56 compute-1 ceph-mon[80126]: pgmap v1043: 353 pgs: 353 active+clean; 54 MiB data, 269 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 822 KiB/s wr, 1 op/s
Jan 23 10:27:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:27:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:56.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:27:56 compute-1 nova_compute[225705]: 2026-01-23 10:27:56.951 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:27:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:27:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:27:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:27:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:27:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:27:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:57.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:27:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:27:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:27:58.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:27:58 compute-1 ceph-mon[80126]: pgmap v1044: 353 pgs: 353 active+clean; 84 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 9.0 KiB/s rd, 1.7 MiB/s wr, 16 op/s
Jan 23 10:27:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:27:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:27:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:27:59.388 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:27:59 compute-1 nova_compute[225705]: 2026-01-23 10:27:59.761 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:00.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:01.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:01 compute-1 podman[238448]: 2026-01-23 10:28:01.680666878 +0000 UTC m=+0.070871575 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:28:01 compute-1 nova_compute[225705]: 2026-01-23 10:28:01.997 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:02.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:03.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2299921761' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:28:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3503106741' entity='client.openstack' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
Jan 23 10:28:04 compute-1 nova_compute[225705]: 2026-01-23 10:28:04.764 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:04.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:05.399 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:06 compute-1 ceph-mon[80126]: pgmap v1045: 353 pgs: 353 active+clean; 84 MiB data, 273 MiB used, 60 GiB / 60 GiB avail; 8.7 KiB/s rd, 1.7 MiB/s wr, 16 op/s
Jan 23 10:28:06 compute-1 ceph-mon[80126]: pgmap v1046: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:28:06 compute-1 ceph-mon[80126]: pgmap v1047: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 27 op/s
Jan 23 10:28:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:06.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:06 compute-1 sudo[238468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:28:06 compute-1 sudo[238468]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:06 compute-1 sudo[238468]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:07 compute-1 nova_compute[225705]: 2026-01-23 10:28:07.000 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:07 compute-1 ovn_controller[133293]: 2026-01-23T10:28:07Z|00095|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Jan 23 10:28:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:07.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:07 compute-1 nova_compute[225705]: 2026-01-23 10:28:07.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:07 compute-1 nova_compute[225705]: 2026-01-23 10:28:07.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:28:07 compute-1 nova_compute[225705]: 2026-01-23 10:28:07.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:28:08 compute-1 nova_compute[225705]: 2026-01-23 10:28:08.065 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:28:08 compute-1 ceph-mon[80126]: pgmap v1048: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 18 KiB/s rd, 1.8 MiB/s wr, 28 op/s
Jan 23 10:28:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:08.863 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:09 compute-1 nova_compute[225705]: 2026-01-23 10:28:09.061 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:09 compute-1 ceph-mon[80126]: pgmap v1049: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 21 KiB/s rd, 1005 KiB/s wr, 31 op/s
Jan 23 10:28:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:09.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:09 compute-1 nova_compute[225705]: 2026-01-23 10:28:09.766 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:10 compute-1 ceph-mon[80126]: pgmap v1050: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 55 KiB/s wr, 16 op/s
Jan 23 10:28:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:10.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:11.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:11 compute-1 nova_compute[225705]: 2026-01-23 10:28:11.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:11 compute-1 nova_compute[225705]: 2026-01-23 10:28:11.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:28:11 compute-1 nova_compute[225705]: 2026-01-23 10:28:11.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:28:11 compute-1 nova_compute[225705]: 2026-01-23 10:28:11.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:28:11 compute-1 nova_compute[225705]: 2026-01-23 10:28:11.903 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:28:11 compute-1 nova_compute[225705]: 2026-01-23 10:28:11.904 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:28:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.004 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:28:12 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/615719513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.464 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.560s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.618 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.620 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4907MB free_disk=59.96738052368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.620 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.620 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.824 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.825 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:28:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:12.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:12 compute-1 nova_compute[225705]: 2026-01-23 10:28:12.956 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:28:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:13.415 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:28:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3161038119' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:13 compute-1 nova_compute[225705]: 2026-01-23 10:28:13.477 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.522s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:28:13 compute-1 nova_compute[225705]: 2026-01-23 10:28:13.484 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:28:13 compute-1 nova_compute[225705]: 2026-01-23 10:28:13.509 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:28:13 compute-1 nova_compute[225705]: 2026-01-23 10:28:13.512 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:28:13 compute-1 nova_compute[225705]: 2026-01-23 10:28:13.512 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.892s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:28:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:13 compute-1 ceph-mon[80126]: pgmap v1051: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 55 KiB/s wr, 85 op/s
Jan 23 10:28:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/676391413' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/615719513' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:14 compute-1 nova_compute[225705]: 2026-01-23 10:28:14.512 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:14 compute-1 nova_compute[225705]: 2026-01-23 10:28:14.512 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:14 compute-1 nova_compute[225705]: 2026-01-23 10:28:14.513 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:14 compute-1 nova_compute[225705]: 2026-01-23 10:28:14.513 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:14 compute-1 nova_compute[225705]: 2026-01-23 10:28:14.513 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:28:14 compute-1 nova_compute[225705]: 2026-01-23 10:28:14.768 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:14.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:14 compute-1 nova_compute[225705]: 2026-01-23 10:28:14.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:15.417 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:15 compute-1 nova_compute[225705]: 2026-01-23 10:28:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:28:16 compute-1 podman[238543]: 2026-01-23 10:28:16.69485291 +0000 UTC m=+0.097310905 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:28:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3161038119' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:16 compute-1 ceph-mon[80126]: pgmap v1052: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:28:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/494566907' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:16.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:17 compute-1 nova_compute[225705]: 2026-01-23 10:28:17.006 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:17.421 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/563239400' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:18 compute-1 ceph-mon[80126]: pgmap v1053: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 12 KiB/s wr, 74 op/s
Jan 23 10:28:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:18.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:19.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:19 compute-1 nova_compute[225705]: 2026-01-23 10:28:19.771 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:20 compute-1 ceph-mon[80126]: pgmap v1054: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 426 B/s wr, 73 op/s
Jan 23 10:28:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3824772010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:20.883 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:21.428 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:21 compute-1 ceph-mon[80126]: pgmap v1055: 353 pgs: 353 active+clean; 88 MiB data, 289 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 69 op/s
Jan 23 10:28:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:22 compute-1 nova_compute[225705]: 2026-01-23 10:28:22.009 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:22 compute-1 systemd[1]: virtsecretd.service: Deactivated successfully.
Jan 23 10:28:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:22.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:23.431 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:23 compute-1 ceph-mon[80126]: pgmap v1056: 353 pgs: 353 active+clean; 92 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 1.9 MiB/s rd, 827 KiB/s wr, 83 op/s
Jan 23 10:28:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:24 compute-1 nova_compute[225705]: 2026-01-23 10:28:24.813 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:24.889 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:25.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:25 compute-1 ceph-mon[80126]: pgmap v1057: 353 pgs: 353 active+clean; 92 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 29 KiB/s rd, 827 KiB/s wr, 14 op/s
Jan 23 10:28:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:26.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:26 compute-1 sudo[238573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:28:26 compute-1 sudo[238573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:26 compute-1 sudo[238573]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:27 compute-1 nova_compute[225705]: 2026-01-23 10:28:27.012 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:27 compute-1 ceph-mon[80126]: pgmap v1058: 353 pgs: 353 active+clean; 95 MiB data, 297 MiB used, 60 GiB / 60 GiB avail; 30 KiB/s rd, 1000 KiB/s wr, 15 op/s
Jan 23 10:28:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:27.438 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:28 compute-1 ceph-mon[80126]: pgmap v1059: 353 pgs: 353 active+clean; 103 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.6 MiB/s wr, 21 op/s
Jan 23 10:28:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:28.895 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:29.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:29 compute-1 nova_compute[225705]: 2026-01-23 10:28:29.815 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:30.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:31 compute-1 ceph-mon[80126]: pgmap v1060: 353 pgs: 353 active+clean; 103 MiB data, 305 MiB used, 60 GiB / 60 GiB avail; 31 KiB/s rd, 1.6 MiB/s wr, 21 op/s
Jan 23 10:28:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:31.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:31 compute-1 sudo[238601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:28:31 compute-1 sudo[238601]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:31 compute-1 sudo[238601]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:31 compute-1 sudo[238632]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:28:31 compute-1 sudo[238632]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:31 compute-1 podman[238625]: 2026-01-23 10:28:31.912780293 +0000 UTC m=+0.091454056 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:28:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:32 compute-1 nova_compute[225705]: 2026-01-23 10:28:32.015 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:32 compute-1 sudo[238632]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:32 compute-1 ceph-mon[80126]: pgmap v1061: 353 pgs: 353 active+clean; 113 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 186 KiB/s rd, 2.1 MiB/s wr, 47 op/s
Jan 23 10:28:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:32.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:33.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:33 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:28:33 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:28:33 compute-1 ceph-mon[80126]: pgmap v1062: 353 pgs: 353 active+clean; 113 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 178 KiB/s rd, 1.5 MiB/s wr, 37 op/s
Jan 23 10:28:33 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:33 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:33 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:28:33 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:28:33 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:28:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:34 compute-1 nova_compute[225705]: 2026-01-23 10:28:34.817 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:34.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:35.450 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:36 compute-1 ceph-mon[80126]: pgmap v1063: 353 pgs: 353 active+clean; 113 MiB data, 313 MiB used, 60 GiB / 60 GiB avail; 178 KiB/s rd, 1.5 MiB/s wr, 37 op/s
Jan 23 10:28:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:36.907 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:37 compute-1 nova_compute[225705]: 2026-01-23 10:28:37.017 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:37.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:37 compute-1 sudo[238704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:28:37 compute-1 sudo[238704]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:37 compute-1 sudo[238704]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:37 compute-1 ceph-mon[80126]: pgmap v1064: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 246 KiB/s rd, 1.3 MiB/s wr, 50 op/s
Jan 23 10:28:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:37 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:28:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:38.911 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:39 compute-1 ceph-mon[80126]: pgmap v1065: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 244 KiB/s rd, 662 KiB/s wr, 43 op/s
Jan 23 10:28:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:39.456 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:39 compute-1 nova_compute[225705]: 2026-01-23 10:28:39.818 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:40.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:41.460 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:41 compute-1 ceph-mon[80126]: pgmap v1066: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 244 KiB/s rd, 661 KiB/s wr, 43 op/s
Jan 23 10:28:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:42 compute-1 nova_compute[225705]: 2026-01-23 10:28:42.021 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:42.917 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:43 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:28:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:43.464 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:44 compute-1 ceph-mon[80126]: pgmap v1067: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 69 KiB/s rd, 59 KiB/s wr, 15 op/s
Jan 23 10:28:44 compute-1 nova_compute[225705]: 2026-01-23 10:28:44.821 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:28:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:44.922 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:28:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:45.470 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:46 compute-1 ceph-mon[80126]: pgmap v1068: 353 pgs: 353 active+clean; 121 MiB data, 314 MiB used, 60 GiB / 60 GiB avail; 60 KiB/s rd, 52 KiB/s wr, 13 op/s
Jan 23 10:28:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:46.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:47 compute-1 nova_compute[225705]: 2026-01-23 10:28:47.021 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:47 compute-1 sudo[238734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:28:47 compute-1 sudo[238734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:28:47 compute-1 sudo[238734]: pam_unix(sudo:session): session closed for user root
Jan 23 10:28:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3848999722' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:28:47 compute-1 podman[238758]: 2026-01-23 10:28:47.207269552 +0000 UTC m=+0.118444654 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Jan 23 10:28:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:47.472 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:48 compute-1 ceph-mon[80126]: pgmap v1069: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 79 KiB/s rd, 53 KiB/s wr, 41 op/s
Jan 23 10:28:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:48.929 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:49.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:49 compute-1 ceph-mon[80126]: pgmap v1070: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:28:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:28:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/292376593' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:28:49 compute-1 nova_compute[225705]: 2026-01-23 10:28:49.823 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:28:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:50.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:28:50.976 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:28:50 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:28:50.977 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:28:50 compute-1 nova_compute[225705]: 2026-01-23 10:28:50.977 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:51.477 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:51 compute-1 ceph-mon[80126]: pgmap v1071: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:28:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:52 compute-1 nova_compute[225705]: 2026-01-23 10:28:52.024 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #61. Immutable memtables: 0.
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.761676) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 35] Flushing memtable with next log file: 61
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132761768, "job": 35, "event": "flush_started", "num_memtables": 1, "num_entries": 2389, "num_deletes": 251, "total_data_size": 6545311, "memory_usage": 6629904, "flush_reason": "Manual Compaction"}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 35] Level-0 flush table #62: started
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132785331, "cf_name": "default", "job": 35, "event": "table_file_creation", "file_number": 62, "file_size": 4222889, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31129, "largest_seqno": 33513, "table_properties": {"data_size": 4213075, "index_size": 6244, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 19896, "raw_average_key_size": 20, "raw_value_size": 4193741, "raw_average_value_size": 4305, "num_data_blocks": 264, "num_entries": 974, "num_filter_entries": 974, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769163913, "oldest_key_time": 1769163913, "file_creation_time": 1769164132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 62, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 35] Flush lasted 23867 microseconds, and 11703 cpu microseconds.
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.785543) [db/flush_job.cc:967] [default] [JOB 35] Level-0 flush table #62: 4222889 bytes OK
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.785629) [db/memtable_list.cc:519] [default] Level-0 commit table #62 started
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.787620) [db/memtable_list.cc:722] [default] Level-0 commit table #62: memtable #1 done
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.787637) EVENT_LOG_v1 {"time_micros": 1769164132787632, "job": 35, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.787659) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 35] Try to delete WAL files size 6534901, prev total WAL file size 6534901, number of live WAL files 2.
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000058.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.789698) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032353130' seq:72057594037927935, type:22 .. '7061786F730032373632' seq:0, type:0; will stop at (end)
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 36] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 35 Base level 0, inputs: [62(4123KB)], [60(12MB)]
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132789821, "job": 36, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [62], "files_L6": [60], "score": -1, "input_data_size": 16839822, "oldest_snapshot_seqno": -1}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 36] Generated table #63: 6336 keys, 14607576 bytes, temperature: kUnknown
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132891817, "cf_name": "default", "job": 36, "event": "table_file_creation", "file_number": 63, "file_size": 14607576, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14564997, "index_size": 25637, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 15877, "raw_key_size": 162344, "raw_average_key_size": 25, "raw_value_size": 14450534, "raw_average_value_size": 2280, "num_data_blocks": 1024, "num_entries": 6336, "num_filter_entries": 6336, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164132, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 63, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.892110) [db/compaction/compaction_job.cc:1663] [default] [JOB 36] Compacted 1@0 + 1@6 files to L6 => 14607576 bytes
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.894848) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.9 rd, 143.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.0, 12.0 +0.0 blob) out(13.9 +0.0 blob), read-write-amplify(7.4) write-amplify(3.5) OK, records in: 6854, records dropped: 518 output_compression: NoCompression
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.894900) EVENT_LOG_v1 {"time_micros": 1769164132894879, "job": 36, "event": "compaction_finished", "compaction_time_micros": 102103, "compaction_time_cpu_micros": 30018, "output_level": 6, "num_output_files": 1, "total_output_size": 14607576, "num_input_records": 6854, "num_output_records": 6336, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000062.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132896544, "job": 36, "event": "table_file_deletion", "file_number": 62}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000060.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164132899508, "job": 36, "event": "table_file_deletion", "file_number": 60}
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.789512) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899690) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899699) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899701) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:52 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:28:52.899703) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:28:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:52.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:53.480 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:53 compute-1 ceph-mon[80126]: pgmap v1072: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 20 KiB/s rd, 14 KiB/s wr, 28 op/s
Jan 23 10:28:54 compute-1 nova_compute[225705]: 2026-01-23 10:28:54.826 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:54.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:28:55.059 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:28:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:28:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:28:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:28:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:28:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:28:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:55.482 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:28:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:56.942 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:28:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:28:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:28:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:28:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:28:57 compute-1 nova_compute[225705]: 2026-01-23 10:28:57.026 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:28:57 compute-1 ceph-mon[80126]: pgmap v1073: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 27 op/s
Jan 23 10:28:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:57.485 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:58 compute-1 ceph-mon[80126]: pgmap v1074: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 19 KiB/s rd, 1.2 KiB/s wr, 28 op/s
Jan 23 10:28:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:28:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:28:58.945 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:28:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:28:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:28:59.488 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:28:59 compute-1 ceph-mon[80126]: pgmap v1075: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:28:59 compute-1 nova_compute[225705]: 2026-01-23 10:28:59.828 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:00.947 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:00 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:29:00.980 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:29:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:01.491 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:02 compute-1 nova_compute[225705]: 2026-01-23 10:29:02.053 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:02 compute-1 ceph-mon[80126]: pgmap v1076: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:02 compute-1 podman[238793]: 2026-01-23 10:29:02.634927094 +0000 UTC m=+0.044673451 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Jan 23 10:29:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:02.951 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:03.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:03 compute-1 ceph-mon[80126]: pgmap v1077: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:04 compute-1 nova_compute[225705]: 2026-01-23 10:29:04.830 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:04.954 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:05.496 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:06 compute-1 ceph-mon[80126]: pgmap v1078: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:06.957 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:07 compute-1 nova_compute[225705]: 2026-01-23 10:29:07.056 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:07 compute-1 sudo[238814]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:29:07 compute-1 sudo[238814]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:07 compute-1 sudo[238814]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:07.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:08 compute-1 ceph-mon[80126]: pgmap v1079: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:08.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:09.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:09 compute-1 nova_compute[225705]: 2026-01-23 10:29:09.832 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:09 compute-1 nova_compute[225705]: 2026-01-23 10:29:09.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:09 compute-1 nova_compute[225705]: 2026-01-23 10:29:09.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:29:09 compute-1 nova_compute[225705]: 2026-01-23 10:29:09.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:29:10 compute-1 nova_compute[225705]: 2026-01-23 10:29:10.010 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:29:10 compute-1 ceph-mon[80126]: pgmap v1080: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:10.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:11 compute-1 nova_compute[225705]: 2026-01-23 10:29:11.004 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:11.504 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:12 compute-1 nova_compute[225705]: 2026-01-23 10:29:12.058 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:12 compute-1 ceph-mon[80126]: pgmap v1081: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:12.967 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:13.512 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:13 compute-1 ceph-mon[80126]: pgmap v1082: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2501464116' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3300512477' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:13 compute-1 nova_compute[225705]: 2026-01-23 10:29:13.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:13 compute-1 nova_compute[225705]: 2026-01-23 10:29:13.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:13 compute-1 nova_compute[225705]: 2026-01-23 10:29:13.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:29:13 compute-1 nova_compute[225705]: 2026-01-23 10:29:13.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:29:13 compute-1 nova_compute[225705]: 2026-01-23 10:29:13.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:29:13 compute-1 nova_compute[225705]: 2026-01-23 10:29:13.903 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:29:13 compute-1 nova_compute[225705]: 2026-01-23 10:29:13.904 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:29:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:29:14 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3441369713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.431 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.527s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.631 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.632 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4914MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.633 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.633 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.697 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.697 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.720 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.738 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.739 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.756 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.783 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.802 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:29:14 compute-1 nova_compute[225705]: 2026-01-23 10:29:14.834 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:29:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:14.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:29:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/423661688' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2920842876' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3441369713' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:29:15 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1986486844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:15 compute-1 nova_compute[225705]: 2026-01-23 10:29:15.292 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:29:15 compute-1 nova_compute[225705]: 2026-01-23 10:29:15.300 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:29:15 compute-1 nova_compute[225705]: 2026-01-23 10:29:15.316 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:29:15 compute-1 nova_compute[225705]: 2026-01-23 10:29:15.317 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:29:15 compute-1 nova_compute[225705]: 2026-01-23 10:29:15.318 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:29:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:15.515 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:16 compute-1 nova_compute[225705]: 2026-01-23 10:29:16.317 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-1 nova_compute[225705]: 2026-01-23 10:29:16.318 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-1 nova_compute[225705]: 2026-01-23 10:29:16.319 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-1 nova_compute[225705]: 2026-01-23 10:29:16.319 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:16 compute-1 nova_compute[225705]: 2026-01-23 10:29:16.319 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:29:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:16.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:17 compute-1 nova_compute[225705]: 2026-01-23 10:29:17.062 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:17 compute-1 ceph-mon[80126]: pgmap v1083: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:17.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:17 compute-1 podman[238889]: 2026-01-23 10:29:17.698346787 +0000 UTC m=+0.105238739 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Jan 23 10:29:17 compute-1 nova_compute[225705]: 2026-01-23 10:29:17.870 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:17 compute-1 nova_compute[225705]: 2026-01-23 10:29:17.892 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:29:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1986486844' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:29:18 compute-1 ceph-mon[80126]: pgmap v1084: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:18.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:19.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:19 compute-1 nova_compute[225705]: 2026-01-23 10:29:19.837 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:20 compute-1 sshd-session[238916]: Invalid user sol from 45.148.10.240 port 55774
Jan 23 10:29:20 compute-1 sshd-session[238916]: Connection closed by invalid user sol 45.148.10.240 port 55774 [preauth]
Jan 23 10:29:20 compute-1 ceph-mon[80126]: pgmap v1085: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:20.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:21.524 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:22 compute-1 nova_compute[225705]: 2026-01-23 10:29:22.063 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:22 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:22 compute-1 ceph-mon[80126]: pgmap v1086: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:22.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:23 compute-1 ceph-mon[80126]: pgmap v1087: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:23.527 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:24 compute-1 nova_compute[225705]: 2026-01-23 10:29:24.839 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:24.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:25.530 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:26 compute-1 ceph-mon[80126]: pgmap v1088: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:26.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:27 compute-1 nova_compute[225705]: 2026-01-23 10:29:27.064 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:27 compute-1 sudo[238921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:29:27 compute-1 sudo[238921]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:27 compute-1 sudo[238921]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:27.533 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:28 compute-1 ceph-mon[80126]: pgmap v1089: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:28.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:29 compute-1 ceph-mon[80126]: pgmap v1090: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:29.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:29 compute-1 nova_compute[225705]: 2026-01-23 10:29:29.842 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:30.995 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:31.537 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:31 compute-1 ceph-mon[80126]: pgmap v1091: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:32 compute-1 nova_compute[225705]: 2026-01-23 10:29:32.066 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:32.998 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:33.559 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:33 compute-1 podman[238950]: 2026-01-23 10:29:33.662812567 +0000 UTC m=+0.063520429 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Jan 23 10:29:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:34 compute-1 ceph-mon[80126]: pgmap v1092: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:34 compute-1 nova_compute[225705]: 2026-01-23 10:29:34.843 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:35.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:35.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:35 compute-1 ceph-mon[80126]: pgmap v1093: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:37.004 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:37 compute-1 nova_compute[225705]: 2026-01-23 10:29:37.068 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:37.564 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:37 compute-1 sudo[238972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:29:37 compute-1 sudo[238972]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:37 compute-1 sudo[238972]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:37 compute-1 sudo[238997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:29:37 compute-1 sudo[238997]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:38 compute-1 ceph-mon[80126]: pgmap v1094: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:38 compute-1 sudo[238997]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:39.007 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-1 ceph-mon[80126]: pgmap v1095: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:29:39 compute-1 ceph-mon[80126]: pgmap v1096: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:29:39 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:29:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:39.567 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:39 compute-1 nova_compute[225705]: 2026-01-23 10:29:39.845 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:41.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:41.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:42 compute-1 nova_compute[225705]: 2026-01-23 10:29:42.072 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:42 compute-1 ceph-mon[80126]: pgmap v1097: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 875 B/s rd, 0 op/s
Jan 23 10:29:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:43.015 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:43.572 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:43 compute-1 ceph-mon[80126]: pgmap v1098: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:44 compute-1 nova_compute[225705]: 2026-01-23 10:29:44.847 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:44 compute-1 sudo[239056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:29:44 compute-1 sudo[239056]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:44 compute-1 sudo[239056]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:45.017 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:45 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:29:45 compute-1 ceph-mon[80126]: pgmap v1099: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:45.575 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:47.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:47 compute-1 nova_compute[225705]: 2026-01-23 10:29:47.073 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:47 compute-1 sudo[239082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:29:47 compute-1 sudo[239082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:29:47 compute-1 sudo[239082]: pam_unix(sudo:session): session closed for user root
Jan 23 10:29:47 compute-1 ceph-mon[80126]: pgmap v1100: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:47.577 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Jan 23 10:29:48 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:29:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Jan 23 10:29:48 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:29:48 compute-1 podman[239108]: 2026-01-23 10:29:48.711092044 +0000 UTC m=+0.105853907 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:29:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:29:48 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2022542434' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:29:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:49.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:49.581 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:49 compute-1 ceph-mon[80126]: pgmap v1101: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 583 B/s rd, 0 op/s
Jan 23 10:29:49 compute-1 nova_compute[225705]: 2026-01-23 10:29:49.848 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:29:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:51.028 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:51.584 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:29:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2404.3 total, 600.0 interval
                                           Cumulative writes: 13K writes, 49K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 13K writes, 4008 syncs, 3.45 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3081 writes, 10K keys, 3081 commit groups, 1.0 writes per commit group, ingest: 9.92 MB, 0.02 MB/s
                                           Interval WAL: 3081 writes, 1324 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:29:51 compute-1 ceph-mon[80126]: pgmap v1102: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:52 compute-1 nova_compute[225705]: 2026-01-23 10:29:52.077 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:53.031 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:53.587 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:54 compute-1 ceph-mon[80126]: pgmap v1103: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:54 compute-1 nova_compute[225705]: 2026-01-23 10:29:54.851 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:55.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:29:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:29:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:29:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:29:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:29:55.061 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:29:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:55.590 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:56 compute-1 ceph-mon[80126]: pgmap v1104: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:56 compute-1 sshd-session[239139]: Accepted publickey for zuul from 192.168.122.10 port 56786 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:29:56 compute-1 systemd-logind[807]: New session 55 of user zuul.
Jan 23 10:29:56 compute-1 systemd[1]: Started Session 55 of User zuul.
Jan 23 10:29:56 compute-1 sshd-session[239139]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:29:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:29:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:29:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:29:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:29:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:29:57 compute-1 sudo[239143]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 23 10:29:57 compute-1 sudo[239143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:29:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:57.038 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:57 compute-1 nova_compute[225705]: 2026-01-23 10:29:57.078 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:29:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:57.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:58 compute-1 ceph-mon[80126]: pgmap v1105: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:29:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:29:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:29:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:29:59.041 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:29:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:29:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:29:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:29:59.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:29:59 compute-1 ceph-mon[80126]: pgmap v1106: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:29:59 compute-1 nova_compute[225705]: 2026-01-23 10:29:59.854 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:00 compute-1 ceph-mon[80126]: from='client.25769 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:00 compute-1 ceph-mon[80126]: from='client.16287 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:00 compute-1 ceph-mon[80126]: Health detail: HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 10:30:00 compute-1 ceph-mon[80126]: [WRN] BLUESTORE_SLOW_OP_ALERT: 2 OSD(s) experiencing slow operations in BlueStore
Jan 23 10:30:00 compute-1 ceph-mon[80126]:      osd.1 observed slow operation indications in BlueStore
Jan 23 10:30:00 compute-1 ceph-mon[80126]:      osd.2 observed slow operation indications in BlueStore
Jan 23 10:30:00 compute-1 ceph-mon[80126]: [WRN] CEPHADM_FAILED_DAEMON: 2 failed cephadm daemon(s)
Jan 23 10:30:00 compute-1 ceph-mon[80126]:     daemon nfs.cephfs.2.0.compute-0.fenqiu on compute-0 is in error state
Jan 23 10:30:00 compute-1 ceph-mon[80126]:     daemon nfs.cephfs.1.0.compute-2.tykohi on compute-2 is in error state
Jan 23 10:30:00 compute-1 ceph-mon[80126]: from='client.25807 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:00 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 10:30:00 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1970715676' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:01.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:01.599 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:01 compute-1 ceph-mon[80126]: from='client.25781 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:01 compute-1 ceph-mon[80126]: from='client.16296 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:01 compute-1 ceph-mon[80126]: from='client.25819 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2737233307' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1970715676' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:01 compute-1 ceph-mon[80126]: pgmap v1107: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2989331378' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:02 compute-1 nova_compute[225705]: 2026-01-23 10:30:02.080 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:03.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:03.601 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:03 compute-1 ceph-mon[80126]: pgmap v1108: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:04 compute-1 podman[239477]: 2026-01-23 10:30:04.663228104 +0000 UTC m=+0.058600598 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Jan 23 10:30:04 compute-1 nova_compute[225705]: 2026-01-23 10:30:04.857 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:05.049 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:05.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:06 compute-1 ovs-vsctl[239527]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 10:30:06 compute-1 ceph-mon[80126]: pgmap v1109: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:07.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:07 compute-1 nova_compute[225705]: 2026-01-23 10:30:07.083 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:07 compute-1 sudo[239620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:30:07 compute-1 sudo[239620]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:07 compute-1 sudo[239620]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:07 compute-1 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 10:30:07 compute-1 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 10:30:07 compute-1 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 10:30:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:07.605 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:08 compute-1 ceph-mon[80126]: pgmap v1110: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:08 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: cache status {prefix=cache status} (starting...)
Jan 23 10:30:08 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:08 compute-1 lvm[239886]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 10:30:08 compute-1 lvm[239886]: VG ceph_vg0 finished
Jan 23 10:30:08 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: client ls {prefix=client ls} (starting...)
Jan 23 10:30:08 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:08 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 10:30:08 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 10:30:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3709761161' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:09.055 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2265376060' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:09 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3709761161' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:09 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 10:30:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2096205967' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:09.608 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 10:30:09 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 10:30:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2048205722' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:09 compute-1 nova_compute[225705]: 2026-01-23 10:30:09.859 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:10 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 10:30:10 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:10 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 10:30:10 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 10:30:10 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2632100089' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 10:30:10 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/604163341' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: ops {prefix=ops} (starting...)
Jan 23 10:30:10 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:10 compute-1 nova_compute[225705]: 2026-01-23 10:30:10.890 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.16308 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.25793 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.16320 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.25808 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: pgmap v1111: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/858010683' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2096205967' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3912976921' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2048205722' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:11.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:30:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/447049587' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:11 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: session ls {prefix=session ls} (starting...)
Jan 23 10:30:11 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:30:11 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: status {prefix=status} (starting...)
Jan 23 10:30:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 10:30:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/661403047' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:11.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:30:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3873136511' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:11 compute-1 nova_compute[225705]: 2026-01-23 10:30:11.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:11 compute-1 nova_compute[225705]: 2026-01-23 10:30:11.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:30:11 compute-1 nova_compute[225705]: 2026-01-23 10:30:11.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:30:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 10:30:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2326913399' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:12 compute-1 nova_compute[225705]: 2026-01-23 10:30:12.085 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:30:12 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1159080727' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:30:12 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/728729314' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:30:12 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3971325804' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.16332 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25823 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25843 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.16347 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25841 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25855 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1500381436' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2632100089' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1513163500' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1236970046' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/604163341' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25867 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25862 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.16383 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/824697158' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1085471352' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25882 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.25883 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: pgmap v1112: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/447049587' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3358390859' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/661403047' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/621193792' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2288603447' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3873136511' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2183856151' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:12 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 10:30:12 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/591404194' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:13.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 10:30:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1309114209' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 10:30:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2372642057' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:13 compute-1 nova_compute[225705]: 2026-01-23 10:30:13.572 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:30:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:13.613 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.16395 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2326913399' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1159080727' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2427895171' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2342214168' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2631441634' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.25937 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/728729314' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3971325804' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/591404194' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: pgmap v1113: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1820403701' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1489607522' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1309114209' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2510791750' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2372642057' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:30:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2049800932' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:13 compute-1 nova_compute[225705]: 2026-01-23 10:30:13.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:30:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/600541726' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:30:14 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2345853878' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:14 compute-1 nova_compute[225705]: 2026-01-23 10:30:14.861 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:14 compute-1 nova_compute[225705]: 2026-01-23 10:30:14.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:45.969799+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81485824 unmapped: 1310720 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:46.969955+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:47.970090+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 978492 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:48.970219+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:49.970371+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:50.970549+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:51.970710+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:52.970972+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980004 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:53.971120+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:54.971276+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.178896904s of 10.185409546s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:55.971478+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:56.971695+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:57.971809+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979413 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:58.971991+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:57:59.972204+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:00.972366+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:01.972547+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:02.972745+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:03.973022+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:04.973288+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:05.973631+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:06.973822+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:07.974000+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:08.974164+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:09.974317+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:10.974524+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:11.974735+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:12.974897+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:13.975057+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:14.975216+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:15.975398+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:16.975577+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:17.975764+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:18.975922+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:19.976109+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:20.976280+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:21.976446+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:22.976592+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a55f919e00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61c00 session 0x55a563070d20
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:23.976795+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:24.976937+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:25.977118+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:26.977300+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:27.977581+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81485824 unmapped: 1310720 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:28.977729+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979281 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:29.977908+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81494016 unmapped: 1302528 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:30.978101+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:31.978310+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81502208 unmapped: 1294336 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:32.978657+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81510400 unmapped: 1286144 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 38.248489380s of 38.295757294s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:33.978869+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979413 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81518592 unmapped: 1277952 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:34.979121+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:35.979390+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:36.979562+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81526784 unmapped: 1269760 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:37.979702+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:38.979935+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 979413 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81534976 unmapped: 1261568 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:39.980190+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022c000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:40.980476+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:41.980771+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81543168 unmapped: 1253376 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:42.980983+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:43.981136+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980925 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81551360 unmapped: 1245184 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:44.981419+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:45.981770+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.098414421s of 12.469996452s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:46.981936+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81559552 unmapped: 1236992 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:47.982103+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:48.982304+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980334 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81567744 unmapped: 1228800 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:49.982758+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:50.983019+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81575936 unmapped: 1220608 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:51.983285+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:52.983422+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81584128 unmapped: 1212416 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:53.983626+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:54.983799+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:55.983996+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:56.984227+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81592320 unmapped: 1204224 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:57.984399+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81600512 unmapped: 1196032 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:58.984580+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81600512 unmapped: 1196032 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:58:59.984851+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81608704 unmapped: 1187840 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.25927 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:00.985038+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.16452 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81616896 unmapped: 1179648 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/261526027' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2049800932' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:01.985195+0000)
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.25936 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/600541726' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1794080162' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4156927190' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:02.985326+0000)
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3699757616' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a562ed3860
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4136571834' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a563070000
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2345853878' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81625088 unmapped: 1171456 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3596812779' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1267146393' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:03.985578+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:04.985794+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:05.986021+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81633280 unmapped: 1163264 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:06.986186+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:07.986356+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:08.986521+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81641472 unmapped: 1155072 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:09.986673+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81649664 unmapped: 1146880 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:10.986839+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81649664 unmapped: 1146880 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:11.987021+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81657856 unmapped: 1138688 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:12.987173+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81657856 unmapped: 1138688 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:13.987327+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980202 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81666048 unmapped: 1130496 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.839187622s of 28.427438736s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:14.987477+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81666048 unmapped: 1130496 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:15.987655+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81666048 unmapped: 1130496 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:16.987806+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81674240 unmapped: 1122304 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:17.987985+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64c00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81690624 unmapped: 1105920 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:18.988192+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981846 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81690624 unmapped: 1105920 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:19.988365+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81690624 unmapped: 1105920 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:20.988563+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a5602192c0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d65000 session 0x55a562e5ef00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81698816 unmapped: 1097728 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Cumulative writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.04 MB/s
                                           Cumulative WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 8279 writes, 33K keys, 8279 commit groups, 1.0 writes per commit group, ingest: 21.31 MB, 0.04 MB/s
                                           Interval WAL: 8279 writes, 1593 syncs, 5.20 writes per sync, written: 0.02 GB, 0.04 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 604.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 6.2e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:21.988741+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81764352 unmapped: 1032192 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:22.988917+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81772544 unmapped: 1024000 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:23.989119+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981846 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81772544 unmapped: 1024000 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:24.989343+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81772544 unmapped: 1024000 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:25.989624+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81788928 unmapped: 1007616 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:26.989787+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.093015671s of 12.653417587s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81788928 unmapped: 1007616 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:27.989924+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81797120 unmapped: 999424 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:28.990160+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981123 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81797120 unmapped: 999424 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:29.990414+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:30.990701+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81805312 unmapped: 991232 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:31.990918+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d43400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:32.991105+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81813504 unmapped: 983040 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:33.991246+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981255 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81821696 unmapped: 974848 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:34.991381+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81854464 unmapped: 942080 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:35.991578+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81854464 unmapped: 942080 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:36.991732+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:37.991919+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81862656 unmapped: 933888 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:38.992049+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981255 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81870848 unmapped: 925696 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:39.992221+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:40.992482+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81903616 unmapped: 892928 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:41.992774+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81911808 unmapped: 884736 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:42.993009+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81911808 unmapped: 884736 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:43.993199+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 981255 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 17.302114487s of 17.311687469s, submitted: 3
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:44.993337+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:45.993608+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81920000 unmapped: 876544 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:46.993803+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:47.994046+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81928192 unmapped: 868352 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:48.994213+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81936384 unmapped: 860160 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:49.994379+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81936384 unmapped: 860160 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:50.994522+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81944576 unmapped: 851968 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:51.994713+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81944576 unmapped: 851968 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:52.994852+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 843776 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:53.995082+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 843776 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:54.995214+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81952768 unmapped: 843776 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:55.995397+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81960960 unmapped: 835584 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:56.995551+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81960960 unmapped: 835584 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:57.995693+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81969152 unmapped: 827392 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:58.995841+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81969152 unmapped: 827392 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T09:59:59.996013+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81969152 unmapped: 827392 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:00.996169+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:01.996381+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81977344 unmapped: 819200 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:02.996662+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81985536 unmapped: 811008 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:03.996842+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81985536 unmapped: 811008 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:04.996989+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:05.997141+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:06.997278+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:07.997424+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 81993728 unmapped: 802816 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:08.997654+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:09.997862+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82001920 unmapped: 794624 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:10.998103+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:11.998281+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82010112 unmapped: 786432 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:12.998418+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82018304 unmapped: 778240 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:13.998562+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82018304 unmapped: 778240 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:14.998739+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread fragmentation_score=0.000033 took=0.000313s
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:15.998933+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82026496 unmapped: 770048 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:16.999013+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 761856 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:17.999094+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:18.999271+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 761856 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:19.999467+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82034688 unmapped: 761856 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:20.999684+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 753664 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:21.999771+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82042880 unmapped: 753664 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:22.999904+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 745472 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:24.000020+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82051072 unmapped: 745472 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:25.000160+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 737280 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:26.000356+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 737280 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:27.000481+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82059264 unmapped: 737280 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:28.000701+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 729088 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:29.000843+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82067456 unmapped: 729088 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:30.001033+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:31.001177+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82075648 unmapped: 720896 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:32.001334+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:33.001549+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:34.001728+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82083840 unmapped: 712704 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:35.001860+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:36.002056+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82092032 unmapped: 704512 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:37.002241+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 696320 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:38.003875+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 696320 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:39.004028+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82100224 unmapped: 696320 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:40.004171+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 688128 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:41.004345+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82108416 unmapped: 688128 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:42.004558+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 679936 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:43.004730+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82116608 unmapped: 679936 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:44.004893+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 671744 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:45.005037+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 671744 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:46.005225+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82124800 unmapped: 671744 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:47.005409+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 663552 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:48.005581+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82132992 unmapped: 663552 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:49.005791+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 655360 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a56017a3c0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a56327a3c0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:50.006011+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 655360 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:51.006143+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82141184 unmapped: 655360 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:52.006283+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 647168 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:53.006379+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82149376 unmapped: 647168 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:54.006522+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 638976 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:55.006717+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 638976 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:56.006977+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82157568 unmapped: 638976 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:57.007151+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 630784 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:58.007371+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82165760 unmapped: 630784 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:00:59.007525+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 622592 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 980532 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:00.007680+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 622592 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d43400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 75.745262146s of 75.806076050s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:01.007849+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82173952 unmapped: 622592 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:02.008018+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 614400 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:03.008229+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82182144 unmapped: 614400 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:04.008445+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 606208 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 982176 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:05.008642+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82190336 unmapped: 606208 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:06.008925+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 598016 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:07.009152+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82198528 unmapped: 598016 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:08.009379+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 589824 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:09.009560+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 589824 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983688 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:10.009697+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 589824 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:11.009949+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 581632 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:12.010204+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82214912 unmapped: 581632 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:13.010408+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 573440 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:14.010595+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82223104 unmapped: 573440 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983688 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:15.010766+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82231296 unmapped: 565248 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:16.011040+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82231296 unmapped: 565248 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:17.011298+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 557056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.955039978s of 17.089204788s, submitted: 3
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:18.011478+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 557056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:19.011714+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82239488 unmapped: 557056 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:20.011976+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82247680 unmapped: 548864 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:21.012151+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82255872 unmapped: 540672 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:22.012377+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:23.012580+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:24.012751+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:25.012887+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:26.013250+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:27.013434+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a56327a000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a563071860
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:28.013651+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:29.013831+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:30.014127+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:31.014353+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:32.014656+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:33.014917+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:34.015157+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82264064 unmapped: 532480 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983556 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:35.015304+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:36.015460+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:37.015670+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:38.015854+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022c000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 20.438745499s of 20.442741394s, submitted: 1
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:39.015987+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983688 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:40.016144+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:41.016341+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:42.016486+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:43.016651+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:44.016841+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985200 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:45.017026+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:46.017335+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:47.017516+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:48.017805+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:49.017954+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984609 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:50.018108+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82272256 unmapped: 524288 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.390316963s of 12.403103828s, submitted: 3
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:51.018349+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:52.018556+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:53.018719+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:54.018919+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82280448 unmapped: 516096 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983886 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:55.019118+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:56.019355+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:57.019529+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:58.019856+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:01:59.020016+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82288640 unmapped: 507904 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983886 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:00.020208+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:01.020452+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:02.020847+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:03.021202+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a56103d680
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a56327b860
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:04.021335+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fca04000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x2fdf9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.533004761s of 13.541749001s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82296832 unmapped: 499712 heap: 82796544 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983973 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:05.021443+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82206720 unmapped: 1638400 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:06.021571+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82247680 unmapped: 1597440 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:07.021677+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82370560 unmapped: 1474560 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:08.021784+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:09.021929+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983886 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:10.021986+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:11.022217+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,1])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:12.022392+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:13.022629+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:14.022921+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64c00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 7.916155815s of 10.561954498s, submitted: 354
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984018 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:15.023135+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:16.023376+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:17.023611+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:18.023858+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:19.023985+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:20.024202+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984018 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:21.024442+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:22.024619+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:23.024805+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:24.024956+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:25.025206+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983427 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:26.025696+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:27.026005+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:28.026527+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:29.027070+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:30.027460+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983427 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.676980972s of 15.957923889s, submitted: 16
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:31.027668+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:32.027846+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:33.028170+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:34.028347+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:35.028711+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:36.029120+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:37.029419+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:38.029676+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:39.029967+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:40.030128+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:41.030585+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:42.030882+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:43.031033+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:44.031185+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:45.031362+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:46.031663+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:47.031831+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:48.032049+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:49.032176+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:50.032400+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:51.032620+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:52.032848+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:53.033045+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:54.033198+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:55.033356+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:56.033571+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:57.033772+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:58.033938+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:02:59.034204+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:00.034387+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:01.034566+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:02.034701+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:03.035048+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:04.035240+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:05.035386+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:06.035567+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:07.035719+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:08.035875+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:09.036030+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:10.036150+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:11.036447+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:12.036602+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:13.036745+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:14.036894+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:15.037054+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:16.037268+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:17.037421+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:18.037578+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d66800 session 0x55a562ed21e0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d56000 session 0x55a560eef4a0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:19.037753+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:20.037914+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:21.038068+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:22.038222+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:23.038362+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:24.038590+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a562e565a0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a56327be00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:25.038832+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:26.039078+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:27.039227+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:28.039393+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:29.039559+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 59.252044678s of 59.258327484s, submitted: 1
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:30.039732+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983427 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:31.039948+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:32.040077+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:33.040265+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:34.040430+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:35.040559+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983559 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:36.040780+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:37.041009+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:38.041190+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:39.041384+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:40.041979+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 983559 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:41.042163+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82534400 unmapped: 1310720 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:42.042324+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d43400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.363295555s of 12.371066093s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:43.042532+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:44.042716+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:45.042889+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985071 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:46.043125+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:47.043274+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:48.043422+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:49.043652+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:50.043803+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:51.043990+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:52.044159+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:53.044328+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:54.044561+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:55.044723+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:56.044957+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:57.045163+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:58.045395+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:03:59.045587+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:00.046149+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:01.046381+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:02.046894+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:03.047160+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:04.047403+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:05.047627+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:06.047893+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:07.048071+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:08.048262+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:09.048452+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:10.048718+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:11.048932+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:12.049079+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:13.049240+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:14.049416+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:15.049624+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:16.049819+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:17.049942+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:18.050106+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:19.050272+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:20.050452+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:21.050558+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:22.050692+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:23.050876+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:24.051038+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:25.051183+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:26.051347+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:27.051560+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:28.051729+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:29.051892+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:30.052070+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:31.052230+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:32.052398+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:33.052555+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:34.052732+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:35.052896+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:36.053066+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:37.053205+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:38.053346+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:39.053490+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:40.053710+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:41.053877+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:42.054099+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:43.054384+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a5601c9c20
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:44.054589+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:45.054913+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:46.055198+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:47.055391+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:48.055569+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:49.055725+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:50.055924+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984807 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:51.056071+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82542592 unmapped: 1302528 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:52.056360+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:53.056522+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 71.638946533s of 71.669204712s, submitted: 3
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:54.056673+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:55.056881+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 984939 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:56.057069+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:57.057214+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:58.057366+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:04:59.057562+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:00.057746+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986451 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:01.058695+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:02.058854+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:03.059039+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:04.059293+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:05.059456+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985860 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:06.059635+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:07.059776+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:08.060103+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:09.060232+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82550784 unmapped: 1294336 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.976454735s of 15.996831894s, submitted: 3
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:10.060376+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:11.060517+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 1277952 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:12.060658+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82567168 unmapped: 1277952 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:13.060788+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:14.060940+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:15.061077+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:16.061255+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:17.061429+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:18.061612+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:19.061811+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:20.061933+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:21.062113+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:22.062697+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:23.062853+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:24.062999+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:25.063161+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:26.063343+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:27.063480+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a5601ca960
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a55f919a40
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:28.063631+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:29.063766+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:30.063959+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:31.064157+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:32.064330+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:33.064543+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:34.064679+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:35.064820+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985728 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:36.065019+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:37.065208+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:38.069195+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64c00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 28.710205078s of 28.833591461s, submitted: 1
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:39.069621+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:40.069828+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 985860 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:41.070055+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:42.070225+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:43.070518+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:44.070917+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022c000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:45.071471+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987372 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:46.071855+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:47.072069+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:48.072240+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:49.072397+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:50.072561+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.073184013s of 12.080324173s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986781 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:51.072739+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:52.072890+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:53.073102+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:54.073531+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:55.073676+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:56.073886+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:57.074032+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:58.074285+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:05:59.074444+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:00.074555+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:01.074771+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:02.074947+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:03.075110+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:04.075270+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:05.075453+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:06.075711+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:07.075874+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:08.076088+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:09.076222+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:10.076414+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:11.076566+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:12.076752+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:13.076898+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:14.077059+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:15.077206+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a562ed32c0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:16.077389+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:17.077570+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a56327a000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a560ef7c20
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:18.077726+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:19.077910+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:20.078042+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:21.078197+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:22.078365+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:23.078503+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:24.118247+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:25.118393+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986649 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:26.118565+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d43400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 35.307456970s of 35.356937408s, submitted: 2
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:27.118699+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:28.118821+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:29.118956+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:30.119074+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 986913 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:31.119211+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:32.119363+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:33.119563+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:34.119785+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:35.119941+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 988425 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:36.120198+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:37.120354+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:38.120664+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:39.120824+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:40.120960+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.996991158s of 14.014451981s, submitted: 3
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:41.121110+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987702 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:42.121340+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:43.121557+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:44.121723+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:45.121870+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:46.122052+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:47.122227+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:48.122404+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:49.122605+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:50.122741+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:51.122887+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:52.123156+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:53.123304+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 nova_compute[225705]: 2026-01-23 10:30:14.980 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:30:14 compute-1 nova_compute[225705]: 2026-01-23 10:30:14.980 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:30:14 compute-1 nova_compute[225705]: 2026-01-23 10:30:14.980 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:30:14 compute-1 nova_compute[225705]: 2026-01-23 10:30:14.981 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:30:14 compute-1 nova_compute[225705]: 2026-01-23 10:30:14.981 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:54.123533+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:55.123675+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:56.123827+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:57.124012+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:58.124155+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:06:59.124306+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:00.124589+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:01.124734+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:02.124923+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:03.125141+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:04.125288+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:05.125484+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:06.125754+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:07.125912+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:08.126068+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:09.126317+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:10.126529+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:11.126689+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:12.126826+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:13.127060+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:14.127318+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:15.127652+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:16.127842+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:17.127992+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:18.128172+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:19.128422+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:20.128645+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:21.128792+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d43400 session 0x55a560e96780
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:22.129002+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:23.129175+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:24.129406+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:25.129592+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:26.129938+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:27.130113+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:28.130309+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:29.130433+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:30.130769+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:31.130930+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 987570 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:32.131092+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d65000
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 52.138698578s of 52.151603699s, submitted: 3
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:33.131246+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:34.131414+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:35.131649+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:36.131826+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 989214 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:37.131968+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82460672 unmapped: 1384448 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:38.132129+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:39.132287+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:40.132569+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:41.132780+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990726 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82468864 unmapped: 1376256 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:42.132933+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:43.133135+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:44.133282+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:45.133524+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:46.133726+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.526103973s of 13.826724052s, submitted: 4
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:47.134100+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:48.134257+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:49.134483+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:50.135444+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:51.136282+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:52.136717+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:53.136870+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:54.137051+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:55.137191+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:56.137711+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:57.138138+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:58.138599+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:07:59.138750+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:00.139334+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:01.139486+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:02.139696+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:03.139926+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:04.174851+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:05.175133+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:06.202743+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:07.202878+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:08.203327+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:09.203535+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:10.203674+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:11.203888+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:12.204051+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:13.204329+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82477056 unmapped: 1368064 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:14.204463+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:15.204700+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:16.204970+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:17.205105+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:18.205251+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:19.205433+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:20.205588+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:21.205881+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:14 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:14 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:22.206018+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:23.206158+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:24.206450+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:14 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a5637525a0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a5637523c0
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:14 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:25.206674+0000)
Jan 23 10:30:14 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:26.207093+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:27.207477+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:28.207664+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:29.207960+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:30.208154+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:31.208299+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:32.208561+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:33.208698+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:34.209194+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:35.209396+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.582626343s of 49.590423584s, submitted: 1
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:36.209648+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990135 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:37.209820+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:38.210347+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:39.210572+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:40.210765+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:41.210915+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991647 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:42.211055+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:43.211295+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:44.211452+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:45.211584+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:46.211845+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 992568 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:47.212017+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.072285652s of 12.089330673s, submitted: 4
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:48.212177+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:49.212410+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:50.212574+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:51.212723+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a560b463c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56016f400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:52.212938+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:53.213117+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:54.213315+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:55.213852+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:56.214178+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:57.214415+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:58.214581+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:59.214725+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:00.214882+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:01.215041+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:02.215272+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:03.215424+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:04.215584+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:05.215793+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:06.215995+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:07.216212+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:08.216385+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:09.216641+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:10.216884+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:11.217050+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:12.217291+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:13.217585+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:14.217741+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:15.217879+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:16.218113+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:17.218247+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:18.218437+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:19.218638+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:20.218858+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:21.219060+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Cumulative writes: 9060 writes, 35K keys, 9060 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 9060 writes, 1959 syncs, 4.62 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 781 writes, 1248 keys, 781 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 781 writes, 366 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:22.219235+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:23.219436+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:24.219627+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:25.219840+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:26.220108+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:27.220325+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:28.220599+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:29.220752+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:30.232552+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:31.232718+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:32.232852+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:33.232994+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:34.233339+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:35.233652+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:36.234003+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:37.234185+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:38.234343+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:39.234527+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:40.234726+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:41.234872+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:42.235048+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:43.235190+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:44.235347+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:45.235535+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:46.235747+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:47.235876+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:48.236002+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:49.236131+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:50.236252+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:51.236374+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:52.236561+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:53.236728+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:54.237105+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:55.237323+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:56.237509+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:57.237693+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:58.237870+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:59.238116+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:00.238249+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:01.238389+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:02.238564+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:03.238748+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:04.238897+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:05.239119+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:06.239317+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:07.239572+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:08.239719+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:09.239869+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:10.240015+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022d400 session 0x55a560ef6780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d65000 session 0x55a5601c9c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:11.242549+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:12.242701+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:13.242892+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:14.243197+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:15.243441+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:16.243777+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:17.243990+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:18.244115+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:19.244264+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:20.244401+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:21.244571+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:22.244724+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 95.210830688s of 95.219345093s, submitted: 2
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:23.244976+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:24.245121+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:25.245295+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:26.245547+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995001 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:27.245785+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:28.245944+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:29.246090+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:30.246251+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:31.246592+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:32.246748+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:33.246965+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:34.247139+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:35.247357+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:36.247594+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:37.247800+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:38.247963+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:39.248224+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:40.248401+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:41.248660+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:42.248830+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:43.249063+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:44.249257+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:45.249486+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:46.249747+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:47.249995+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:48.250196+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:49.250449+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:50.250640+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:51.250852+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:52.251060+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:53.251655+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:54.251824+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:55.252050+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:56.252298+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:57.252577+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:58.252766+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:59.253021+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:00.253215+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:01.253481+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:02.253700+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:03.253925+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:04.254115+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:05.254380+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:06.254592+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:07.254800+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:08.255020+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:09.255298+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:10.255577+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:11.255859+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:12.256029+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:13.256212+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:14.256373+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:15.256560+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:16.256822+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:17.257105+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:18.257281+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:19.257471+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:20.257792+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:21.258088+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:22.258274+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:23.258466+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:24.258678+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:25.258877+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:26.259102+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:27.259292+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:28.259536+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:29.259678+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:30.259845+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:31.260016+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:32.260176+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:33.260367+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:34.260598+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:35.260862+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:36.261255+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:37.261581+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:38.261848+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:39.262074+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:40.262246+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:41.262443+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:42.262618+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:43.262775+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:44.262939+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:45.263082+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:46.263265+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:47.263429+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:48.263628+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:49.263812+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:50.264025+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:51.264160+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:52.264379+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a560c23a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a560b46780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:53.264588+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:54.264771+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d56000 session 0x55a56370c000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a560223860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:55.264927+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:56.265155+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:57.265378+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:58.265569+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:59.265771+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:00.265949+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:01.266114+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:02.266289+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:03.266450+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 99.385841370s of 100.670951843s, submitted: 7
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:04.266615+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83787776 unmapped: 57344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:05.266810+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1,0,1])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83861504 unmapped: 1032192 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:06.267172+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83869696 unmapped: 1024000 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:07.267395+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995535 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,3])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:08.267550+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:09.267762+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:10.267932+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84934656 unmapped: 1007616 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:11.268129+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84942848 unmapped: 999424 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:12.268284+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995463 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:13.268427+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 3.510344267s of 10.329211235s, submitted: 399
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:14.268567+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:15.268709+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:16.268870+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:17.268991+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994281 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:18.269131+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:19.269379+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:20.269546+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:21.269752+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:22.269910+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:23.270331+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:24.270543+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:25.270739+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:26.270953+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:27.271166+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:28.271373+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:29.271577+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:30.271743+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:31.271937+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:32.272151+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:33.272369+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:34.272578+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:35.272761+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:36.273019+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:37.273203+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:38.273381+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:39.273569+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:40.273730+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:41.273903+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:42.274115+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:43.274300+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a5602190e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a562f18960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:44.274432+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:45.274609+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:46.274816+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:47.274992+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:48.275176+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:49.275386+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:50.275575+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:51.275784+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:52.276000+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:53.276260+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.042835236s of 40.052230835s, submitted: 3
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:54.277013+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:55.277230+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:56.277567+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:57.277775+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994149 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:58.277986+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:59.278254+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:00.278470+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d65000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:01.278760+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:02.278996+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:03.279165+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:04.279431+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:05.279708+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:06.279998+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:07.280241+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.938447952s of 13.945899963s, submitted: 2
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:08.280427+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:30:15 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4293863724' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:09.280683+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:10.280879+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:11.281022+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:12.281179+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:13.281426+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:14.281618+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:15.281802+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:16.281999+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:17.282186+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:18.282594+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:19.282768+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:20.282925+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:21.283078+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:22.283238+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:23.283403+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:24.283544+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:25.283680+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:26.283861+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:27.284055+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:28.284281+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:29.284481+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:30.284746+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:31.284916+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:32.285134+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:33.285391+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:34.285620+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:35.285836+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:36.286101+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:37.286303+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:38.286551+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:39.286787+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:40.286982+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:41.287146+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:42.287792+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:43.290689+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:44.291793+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:45.292469+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:46.293558+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:47.294282+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:48.294575+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:49.295690+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:50.296291+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:51.296889+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:52.297092+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:53.297276+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:54.297601+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:55.297760+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:56.298028+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:57.298236+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:58.298444+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:59.298691+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:00.298863+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:01.299168+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:02.299404+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:03.299585+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:04.299750+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:05.299908+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:06.300111+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:07.300477+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:08.300910+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:09.301320+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:10.301440+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:11.301743+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:12.301936+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:13.302151+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:14.302346+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:15.302531+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:16.302847+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:17.303120+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:18.303406+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:19.303643+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:20.303819+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:21.304081+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:22.304256+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:23.304409+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:24.304597+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:25.304748+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:26.304954+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:27.305252+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:28.305537+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:29.305801+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:30.305976+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:31.306146+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:32.306371+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:33.306581+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:34.306826+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:35.307015+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:36.307244+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:37.307617+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:38.307858+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:39.308165+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:40.308328+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:41.308593+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:42.308874+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:43.309160+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:44.309307+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:45.309581+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:46.309782+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:47.309954+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:48.310145+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:49.310331+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:50.310564+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:51.310797+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:52.310996+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:53.311183+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:54.311367+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:55.311813+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:56.312036+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:57.312151+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:58.312299+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:59.312451+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:00.312603+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:01.312789+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:02.312972+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:03.313200+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:04.313397+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:05.313565+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:06.313828+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:07.314036+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 120.149322510s of 120.153068542s, submitted: 1
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:08.314191+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 999295 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 1671168 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:09.314330+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:10.314591+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56022d800 session 0x55a560b46000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:11.314757+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fc5e9000/0x0/0x4ffc00000, data 0x161cf8/0x221000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:12.314950+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56049b000 session 0x55a563595a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:13.315134+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1144302 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb179000/0x0/0x4ffc00000, data 0x15d1cf8/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:14.315313+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:15.315467+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:16.316196+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:17.316387+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:18.316696+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:19.316893+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:20.317037+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:21.317241+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:22.317904+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:23.318040+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:24.318728+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:25.318891+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:26.319356+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:27.320615+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:28.321322+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a563594000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a55fd2bc00 session 0x55a562f1fc20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:29.321626+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:30.322043+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:31.322181+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:32.322373+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:33.322678+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:34.322827+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:35.323042+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:36.323354+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:37.323530+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:38.323707+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022d400 session 0x55a56387fa40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:39.323852+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.783803940s of 31.204965591s, submitted: 49
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:40.324146+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:41.324388+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:42.324546+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:43.324795+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150286 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:44.324998+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:45.325255+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d65000 session 0x55a560e963c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64c00 session 0x55a5627854a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:46.325539+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:47.325794+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:48.326080+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150194 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb175000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:49.326268+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a560a07a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a560b472c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64000 session 0x55a560e912c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:50.326537+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.051367760s of 11.062813759s, submitted: 3
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d67400 session 0x55a560e96f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022c400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022c400 session 0x55a563594b40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85614592 unmapped: 18161664 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:51.326755+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a5635941e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85639168 unmapped: 18137088 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:52.326957+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:53.327137+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a560b61400 session 0x55a562e5e960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387e5a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d67400 session 0x55a5637521e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561cd6000 session 0x55a560219e00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187745 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a56049b000 session 0x55a560e914a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:54.327325+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:55.327565+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:56.327801+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:57.327985+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87138304 unmapped: 16637952 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387fc20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:58.328231+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1186624 data_alloc: 218103808 data_used: 303104
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87146496 unmapped: 16629760 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:59.328436+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89415680 unmapped: 14360576 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:00.328607+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89473024 unmapped: 14303232 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.645101547s of 10.825790405s, submitted: 53
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:01.328751+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:02.328953+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:03.329197+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:04.329394+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:05.329559+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:06.329772+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:07.329955+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:08.330117+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:09.330279+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:10.330693+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:11.330871+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.822414398s of 10.054231644s, submitted: 18
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:12.331097+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:13.331271+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa929000/0x0/0x4ffc00000, data 0x1e13078/0x1edb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:14.331475+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:15.331662+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:16.331923+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:17.332129+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:18.332387+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:19.332624+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:20.332907+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:21.333084+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:22.333302+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:23.333601+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:24.333826+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:25.334032+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:26.334281+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:27.334516+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:28.334769+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:29.334987+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:30.335171+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:31.335403+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:32.335647+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:33.335904+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:34.336162+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:35.336331+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:36.336540+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:37.336717+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:38.336905+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a56327b0e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3dc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a563595e00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a562f192c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:39.337081+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92602368 unmapped: 11173888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.268079758s of 28.147586823s, submitted: 54
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a56021e780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef6d20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:40.337262+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92594176 unmapped: 11182080 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a562785680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:41.337444+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a55f9194a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3dc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a562e5fe00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e5f2c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a5602214a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a560222f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:42.337677+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:43.337867+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:44.338055+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:45.338226+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:46.338451+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:47.338644+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:48.338872+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:49.339101+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:50.339326+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:51.339624+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:52.339919+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:53.340086+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:54.340263+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:55.340672+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:56.341013+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93298688 unmapped: 20979712 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.620344162s of 17.155050278s, submitted: 29
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a560b99c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc1800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:57.341156+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 94502912 unmapped: 19775488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:58.341318+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100663296 unmapped: 13615104 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:59.341627+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:00.341850+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:01.342114+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:02.342314+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:03.342606+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:04.342848+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:05.343078+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:06.343423+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:07.343663+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:08.343878+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 9502720 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.021077156s of 12.051360130s, submitted: 8
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439531 data_alloc: 234881024 data_used: 14000128
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:09.344100+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 9658368 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:10.344365+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112254976 unmapped: 8896512 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4e000/0x0/0x4ffc00000, data 0x3655088/0x371e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:11.344622+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:12.344909+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:13.345225+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1517819 data_alloc: 234881024 data_used: 14286848
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:14.345489+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:15.345734+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:16.345975+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:17.346265+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:18.346485+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112541696 unmapped: 8609792 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.255970955s of 10.148886681s, submitted: 105
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc1800 session 0x55a562e563c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a56370c780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1513215 data_alloc: 234881024 data_used: 14286848
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:19.346650+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103571456 unmapped: 17580032 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563595680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:20.346878+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:21.347129+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:22.347361+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:23.347676+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:24.347956+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:25.348224+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:26.348557+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:27.348742+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:28.348967+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.262123108s of 10.403896332s, submitted: 50
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:29.349172+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560c245a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5602192c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103260160 unmapped: 17891328 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,1])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a56103c5a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:30.349344+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:31.349583+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:32.349809+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560ef61e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562f1f680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:33.350016+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:34.350240+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:35.350487+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:36.350810+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:37.351035+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:38.351247+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:39.351450+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:40.351681+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:41.351938+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:42.352115+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:43.352374+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.069961548s of 14.267497063s, submitted: 33
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:44.352564+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:45.352807+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:15.064 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:46.353076+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:47.353284+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:48.353567+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:49.353806+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:50.354128+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:51.354337+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:52.354540+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee6f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5601c9c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a5601c9680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:53.354811+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.291867256s of 10.369996071s, submitted: 3
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a56327b860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184631 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560a9fa40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:54.355046+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee7a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:55.355247+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100130816 unmapped: 21020672 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560219a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a5630701e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e56780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:56.355458+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a5630d8f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560ef6780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:57.355659+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630dad20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aef000/0x0/0x4ffc00000, data 0x1ab6ff3/0x1b7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:58.355838+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54000 session 0x55a56103d4a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1226346 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:59.356087+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560eef2c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:00.356327+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560eefe00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99868672 unmapped: 29679616 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:01.356489+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99893248 unmapped: 29655040 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:02.356671+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:03.356901+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:04.357044+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:05.357174+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:06.357385+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:07.357648+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:08.357865+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:09.358124+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:10.358329+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:11.358622+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:12.358838+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.546638489s of 18.645618439s, submitted: 27
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 26910720 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e561e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b990e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:13.358991+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300302 data_alloc: 218103808 data_used: 6565888
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:14.359170+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9635000/0x0/0x4ffc00000, data 0x1f6a003/0x2031000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,10])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:15.359469+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106921984 unmapped: 22626304 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:16.359770+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:17.360012+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:18.360230+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310702 data_alloc: 218103808 data_used: 6471680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:19.360425+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:20.360588+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:21.360759+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:22.360961+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:23.361164+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.670749664s of 11.247441292s, submitted: 64
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:24.361384+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:25.361571+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:26.361791+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:27.361987+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:28.362169+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:29.362394+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:30.362574+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:31.362767+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:32.362924+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:33.363214+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:34.363452+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:35.363625+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.062977791s of 12.069572449s, submitted: 1
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:36.363800+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:37.363963+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:38.364153+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302567 data_alloc: 218103808 data_used: 6483968
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:39.364371+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:40.364571+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560eef0e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:41.364787+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5603c8b40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:42.365006+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:43.365219+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 27033600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:44.365404+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a562f1e5a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:45.365622+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:46.365799+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:47.366019+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:48.366173+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:49.366378+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:50.366682+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:51.367013+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:52.367201+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:53.367426+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:54.367679+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:55.367941+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:56.368240+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:57.368466+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:58.368701+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:59.368942+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:00.369152+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:01.369347+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:02.369590+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:03.369831+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:04.369963+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:05.370184+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:06.370361+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:07.370610+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:08.370820+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:09.370997+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:10.371148+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630703c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a563070f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5630705a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:11.371325+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:12.371596+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.487091064s of 36.540157318s, submitted: 34
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560ef61e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef63c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560ef7e00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c55800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55800 session 0x55a560ef6960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560ef6000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:13.371776+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:14.372049+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275342 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:15.372221+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:16.372542+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:17.372719+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:18.372955+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:19.373097+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276816 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c57c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101695488 unmapped: 31531008 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:20.373255+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102670336 unmapped: 30556160 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:21.373392+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1804.3 total, 600.0 interval
                                           Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 10K writes, 2684 syncs, 4.00 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1689 writes, 4700 keys, 1689 commit groups, 1.0 writes per commit group, ingest: 4.62 MB, 0.01 MB/s
                                           Interval WAL: 1689 writes, 725 syncs, 2.33 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:22.373563+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:23.373774+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:24.373926+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:25.374180+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:26.374440+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:27.374617+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:28.374771+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:29.374933+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:30.375142+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:31.375357+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:32.375606+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:33.375740+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:34.375949+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350291 data_alloc: 234881024 data_used: 11251712
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:35.376117+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.624202728s of 23.833007812s, submitted: 21
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:36.376344+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:37.376602+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:38.376763+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111206400 unmapped: 22020096 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:39.376956+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412875 data_alloc: 234881024 data_used: 11755520
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8cba000/0x0/0x4ffc00000, data 0x28ecfe3/0x29b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,1])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 21528576 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:40.377098+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113246208 unmapped: 19980288 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:41.377247+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:42.377416+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:43.377631+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:44.377794+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433231 data_alloc: 234881024 data_used: 12496896
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:45.377982+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:46.378188+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:47.378367+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:48.378533+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:49.378703+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428407 data_alloc: 234881024 data_used: 12496896
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:50.378926+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.531507492s of 14.888542175s, submitted: 102
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:51.379071+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:52.379205+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:53.379340+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:54.379506+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428511 data_alloc: 234881024 data_used: 12496896
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:55.379656+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:56.379914+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:57.380093+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:58.380283+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120709120 unmapped: 12517376 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:59.380458+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a560e90d20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c22000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd9000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd9000 session 0x55a560b47860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1484761 data_alloc: 234881024 data_used: 12496896
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560c24000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630710e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:00.380604+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560b463c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:01.380776+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a562785a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:02.380930+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562214800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214800 session 0x55a55fee70e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.321987152s of 11.643644333s, submitted: 14
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a563594960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f84b7000/0x0/0x4ffc00000, data 0x30effe3/0x31b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:03.381076+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:04.381196+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503794 data_alloc: 234881024 data_used: 14667776
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:05.381340+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:06.381582+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:07.381768+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:08.381933+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:09.382117+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532370 data_alloc: 234881024 data_used: 18919424
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:10.382280+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:11.382450+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:12.382659+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:13.382845+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:14.382967+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532658 data_alloc: 234881024 data_used: 18923520
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.284761429s of 12.309606552s, submitted: 7
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:15.383041+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:16.383180+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:17.383365+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121954304 unmapped: 11272192 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:18.383597+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:19.383793+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649774 data_alloc: 234881024 data_used: 19283968
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:20.383953+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:21.384064+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:22.384227+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:23.384393+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:24.384620+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1644310 data_alloc: 234881024 data_used: 19283968
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:25.384846+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55fee7c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a56103c000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a5603c9a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d000 session 0x55a5601c8000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.608616829s of 10.889985085s, submitted: 103
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:26.385056+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601cbe00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:27.385206+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:28.385415+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87ee000/0x0/0x4ffc00000, data 0x29a7fe3/0x2a6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:29.385610+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442224 data_alloc: 234881024 data_used: 12496896
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563070960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee6f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:30.387571+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:31.387736+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a561048d20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:32.387894+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:33.388045+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:34.388220+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217850 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:35.388355+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:36.388598+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.357207298s of 10.545221329s, submitted: 64
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:37.388779+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:38.388950+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:39.389137+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1219494 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:40.392112+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:41.394785+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:42.395027+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:43.396795+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:44.398246+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221006 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:45.399370+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:46.400321+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:47.400751+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:48.401359+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.268618584s of 12.279949188s, submitted: 3
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:49.401889+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:50.402331+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:51.402723+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:52.402945+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:53.403442+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:54.404145+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:55.404830+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:56.405475+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:57.406105+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:58.406653+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:59.407161+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:00.407575+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:01.408008+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:02.408233+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:03.408427+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5623ae400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:04.408600+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.385444641s of 16.389841080s, submitted: 1
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:05.408743+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:06.409048+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:07.409283+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5623ae400 session 0x55a560ef7a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5635954a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630714a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:08.409440+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630705a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630703c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:09.409745+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286318 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:10.409900+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:11.410072+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107503616 unmapped: 33071104 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:12.410229+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560eeed20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:13.412346+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:14.412644+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 33587200 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1288671 data_alloc: 218103808 data_used: 393216
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:15.412761+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 32366592 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:16.414594+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:17.414796+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:18.414987+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:19.415205+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:20.415359+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:21.415534+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:22.415717+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:23.415858+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a560eef4a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:24.416012+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:25.416181+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.396499634s of 20.359004974s, submitted: 48
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111820800 unmapped: 28753920 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:26.416442+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115318784 unmapped: 25255936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:27.416604+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c5f000/0x0/0x4ffc00000, data 0x2537045/0x25fd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 25124864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c54000/0x0/0x4ffc00000, data 0x2541045/0x2607000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:28.416791+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:29.416988+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409157 data_alloc: 218103808 data_used: 9031680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:30.417192+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:31.417333+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bd3000/0x0/0x4ffc00000, data 0x25c3045/0x2689000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:32.417624+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:33.417849+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113139712 unmapped: 27435008 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:34.418044+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 27426816 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408089 data_alloc: 218103808 data_used: 9035776
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:35.418228+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:36.418571+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:37.418754+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870034218s of 12.169149399s, submitted: 80
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:38.418981+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:39.419192+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411361 data_alloc: 218103808 data_used: 9035776
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:40.419414+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:41.419690+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:42.419884+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:43.420084+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:44.420242+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411659 data_alloc: 218103808 data_used: 9043968
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:45.420384+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:46.420565+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 27279360 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:47.420721+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:48.420871+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:49.421072+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412595 data_alloc: 218103808 data_used: 9043968
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:50.421264+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:51.421464+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.734275818s of 13.774451256s, submitted: 9
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:52.421713+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:53.421944+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee7c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a55fee6f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a561048d20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d62800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d62800 session 0x55a562c57c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114442240 unmapped: 26132480 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:54.422201+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562c574a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a070e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a563071a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a5603c9680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c55c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55c00 session 0x55a560ef7c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 25993216 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec06e/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432887 data_alloc: 218103808 data_used: 9048064
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:55.422424+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:56.422694+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec0a7/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:57.422864+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:58.423112+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113704960 unmapped: 26869760 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a5603c9c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:59.423326+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 26853376 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434949 data_alloc: 218103808 data_used: 9048064
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:00.423576+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d66000 session 0x55a560223a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:01.423757+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x26ed0ca/0x27b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:02.423962+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.734139442s of 11.903190613s, submitted: 57
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:03.424216+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114343936 unmapped: 26230784 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:04.424378+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114401280 unmapped: 26173440 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440168 data_alloc: 234881024 data_used: 9789440
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:05.424535+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [0,0,1])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:06.424823+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114794496 unmapped: 25780224 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:07.425028+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:08.425226+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:09.425439+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440509 data_alloc: 234881024 data_used: 9789440
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:10.425607+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:11.425748+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116137984 unmapped: 24436736 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:12.425961+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:13.426200+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:14.426392+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f840e000/0x0/0x4ffc00000, data 0x2d700ca/0x2e38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1504109 data_alloc: 234881024 data_used: 9850880
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:15.426609+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560c25680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef65a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117350400 unmapped: 23224320 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.155679703s of 12.892781258s, submitted: 488
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:16.426871+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:17.427067+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:18.427235+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:19.427420+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496421 data_alloc: 234881024 data_used: 9854976
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:20.427619+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:21.427915+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:22.428275+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:23.428579+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:24.428800+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f83f3000/0x0/0x4ffc00000, data 0x2da10ca/0x2e69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496237 data_alloc: 234881024 data_used: 9854976
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:25.428959+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.086823463s of 10.129245758s, submitted: 9
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:26.429154+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a5601c8780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55f9181e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116473856 unmapped: 24100864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c22f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:27.429343+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:28.429635+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:29.429905+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:30.430075+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:31.430250+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:32.430532+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:33.430747+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:34.430948+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:35.431153+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:36.431429+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:37.431695+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:38.431992+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.182063103s of 12.550888062s, submitted: 72
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:39.432146+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:40.432318+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1424015 data_alloc: 218103808 data_used: 9109504
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601ca780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560c24f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:41.432445+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a563071680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:42.432608+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:43.432866+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:44.433099+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:45.433308+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:46.433567+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:47.433757+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:48.433920+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:49.434123+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:50.434477+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:51.434877+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:52.435162+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:53.435418+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:54.435793+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:55.436034+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:56.436326+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:57.436540+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:58.436731+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2bc00 session 0x55a5627841e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a562c56000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:59.436955+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:00.437120+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:01.437346+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:02.437821+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:03.437980+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:04.438177+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:05.438340+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:06.438587+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:07.438998+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:08.439212+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:09.439354+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.355859756s of 30.541212082s, submitted: 59
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:10.439566+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245655 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:11.439859+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e57680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56327a5a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a07a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:12.439993+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc3800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a5603c94a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560a072c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562785e00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a560ef6000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56370cb40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc3800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a560219860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:13.440295+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:14.440593+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:15.440820+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339283 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112795648 unmapped: 31449088 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:16.441114+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560e914a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 31440896 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:17.441361+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 31277056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:18.441552+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [1])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:19.441787+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:20.441978+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1418779 data_alloc: 234881024 data_used: 12140544
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:21.442206+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:22.443884+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560b465a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5635954a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.497446060s of 13.612625122s, submitted: 20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56021fe00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:23.444096+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:24.444309+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:25.445084+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:26.445526+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:27.445703+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:28.445902+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:29.446414+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:30.446826+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:31.447199+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:32.447453+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:33.447739+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:34.448423+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:35.448591+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:36.448860+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022dc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.943146706s of 13.990984917s, submitted: 17
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022dc00 session 0x55a56387e780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c570e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:37.449076+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5602230e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a563752d20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a562f19e00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:38.449332+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:39.449524+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9819000/0x0/0x4ffc00000, data 0x197d045/0x1a43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:40.449708+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1289281 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a563d89000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89000 session 0x55a560219a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:41.449934+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562e56f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a562f1f680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563752000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c24000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:42.450071+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5603c9a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:43.450289+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:44.450473+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:45.450736+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 32890880 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298244 data_alloc: 218103808 data_used: 1359872
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:46.450970+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:47.451217+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:48.451465+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.371298790s of 11.489388466s, submitted: 37
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56021fa40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562581400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:49.451709+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110919680 unmapped: 33325056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56017a960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:50.451934+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 33308672 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: mgrc ms_handle_reset ms_handle_reset con 0x55a561cb3c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 10:30:15 compute-1 ceph-osd[77616]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: get_auth_request con 0x55a561d56000 auth_method 0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: mgrc handle_mgr_configure stats_period=5
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:51.452077+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d68c00 session 0x55a562e5f680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5621f9800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56016f400 session 0x55a562ed23c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d43400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:52.452475+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:53.452755+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:54.453422+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a55f6d90e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:55.453617+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:56.453780+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:57.454014+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:58.454837+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:59.455068+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:00.455353+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:01.455617+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:02.456031+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:03.456205+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:04.456619+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:05.456756+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b60c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.596210480s of 16.945894241s, submitted: 37
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260315 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:06.457254+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:07.457414+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:08.457712+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:09.457892+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:10.458069+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:11.458218+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:12.458435+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:13.458655+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:14.458963+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:15.459101+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:16.459362+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:17.459541+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:18.459688+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:19.459930+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.821660995s of 13.956790924s, submitted: 3
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:20.460330+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:21.460478+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:22.460724+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:23.460917+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:24.461147+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:25.461361+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:26.461607+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:27.461746+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58400 session 0x55a5601ca3c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b46780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55f919860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562c563c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562581400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56370cd20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:28.462062+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:29.462303+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:30.462618+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a563d89800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89800 session 0x55a5630dad20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303917 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96ee000/0x0/0x4ffc00000, data 0x1aa8045/0x1b6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:31.462772+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 32940032 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560a9fa40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:32.462949+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5601ca3c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.447840691s of 13.576416016s, submitted: 37
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111329280 unmapped: 32915456 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:33.463094+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:34.463374+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5601caf00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562581400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc2000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:35.463543+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307891 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:36.463748+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:37.463926+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110780416 unmapped: 33464320 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:38.464095+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:39.464271+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:40.464487+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:41.464753+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:42.464990+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:43.465235+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:44.465483+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:45.465707+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111091712 unmapped: 33153024 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:46.466128+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111099904 unmapped: 33144832 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:47.466376+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.020989418s of 15.036432266s, submitted: 3
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112001024 unmapped: 32243712 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:48.466589+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115662848 unmapped: 28581888 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:49.466772+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:50.466943+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:51.467062+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:52.467207+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:53.467417+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:54.467600+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:55.467775+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:56.467967+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:57.468164+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:58.468331+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:59.468865+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:00.469015+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:01.469193+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.164536476s of 14.331671715s, submitted: 56
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a5601cba40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:02.469352+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc2000 session 0x55a5601c81e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560219e00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:03.469589+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:04.470032+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:05.470276+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:06.470654+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:07.470844+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:08.471189+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:09.471429+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:10.471671+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:11.471839+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:12.472199+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a56021f860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a562ed21e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:13.472680+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:14.472881+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:15.473043+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:16.473258+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:17.473555+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:18.473887+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:19.474079+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:20.474421+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:21.474740+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:22.475021+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:23.475166+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:24.475341+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562213c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.203777313s of 22.330352783s, submitted: 42
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213c00 session 0x55a5601ca960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562215800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215800 session 0x55a562ed2b40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee70e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560220000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a562e5f2c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:25.475572+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9689000/0x0/0x4ffc00000, data 0x1b0dfe3/0x1bd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343500 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:26.475888+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:27.476037+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:28.476274+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:29.476545+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562202800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a562784780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255c400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a563610800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111263744 unmapped: 32980992 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:30.476692+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343804 data_alloc: 218103808 data_used: 339968
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111198208 unmapped: 33046528 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:31.476869+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:32.477065+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:33.477280+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:34.477607+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:35.477740+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399892 data_alloc: 218103808 data_used: 8769536
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:36.477917+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.868194580s of 11.951797485s, submitted: 14
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:37.478279+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:38.478455+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:39.478595+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:40.478749+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399760 data_alloc: 218103808 data_used: 8769536
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:41.478931+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:42.479091+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 28647424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:43.479255+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 22994944 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:44.479554+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 22970368 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d6bc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d6bc00 session 0x55a563071680
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a563070960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c881e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:45.479763+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560c883c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 119250944 unmapped: 24993792 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562202800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a5601ca780
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56800 session 0x55a560ef6960
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a561048d20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5610485a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a5610492c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586678 data_alloc: 234881024 data_used: 9949184
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:46.480025+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:47.480197+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:48.480393+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:49.480564+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:50.480723+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 31072256 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562213800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a55fee6f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.895648956s of 14.161753654s, submitted: 94
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56257a400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56257a400 session 0x55a55fee7a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586694 data_alloc: 234881024 data_used: 9949184
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:51.481046+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2b000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a55fee6d20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee7c20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:52.481176+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562213800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:53.481313+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:54.481444+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 27189248 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:55.481680+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:56.481881+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:57.483201+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:58.483563+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:59.483787+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:00.484074+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:01.484216+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:02.484391+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:03.484575+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:04.484988+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.947762489s of 13.955580711s, submitted: 2
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:05.488650+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134848512 unmapped: 17276928 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f777a000/0x0/0x4ffc00000, data 0x360bff3/0x36d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6b8d000/0x0/0x4ffc00000, data 0x41f0ff3/0x42b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:06.489897+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759978 data_alloc: 234881024 data_used: 22151168
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134922240 unmapped: 17203200 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:07.490049+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:08.491419+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:09.491651+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6af2000/0x0/0x4ffc00000, data 0x4293ff3/0x435a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:10.494253+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:11.494515+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1771750 data_alloc: 234881024 data_used: 22212608
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:12.494818+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135184384 unmapped: 16941056 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:13.494979+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:14.496590+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:15.497201+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:16.497385+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770406 data_alloc: 234881024 data_used: 22220800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:17.497583+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:18.497738+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.216358185s of 14.454858780s, submitted: 91
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:19.497905+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:20.498267+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:21.498456+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770158 data_alloc: 234881024 data_used: 22220800
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:22.498762+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:23.499039+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5603c92c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a560ef7a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:24.499215+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55f6c5400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:25.499469+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:26.499871+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1512596 data_alloc: 234881024 data_used: 9957376
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8392000/0x0/0x4ffc00000, data 0x29f3ff3/0x2aba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:27.500155+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:28.500403+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55f6c5400 session 0x55a561048b40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:29.500913+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:30.501234+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:31.501436+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:32.501729+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:33.502172+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:34.502392+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:35.502584+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:36.505136+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:37.505346+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:38.505550+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:39.505687+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:40.505864+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560ef72c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:41.506046+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563610800 session 0x55a560c24f00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:42.506228+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255cc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.984561920s of 24.057754517s, submitted: 24
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:43.506412+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:44.506590+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:45.506803+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:46.507008+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:47.507223+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560219860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:48.507393+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:49.507600+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:50.507797+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:51.507970+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:52.508118+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:53.508306+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:54.508487+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:55.508790+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:56.508980+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:57.509217+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:58.509423+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:59.509628+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:00.509797+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:01.509974+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:02.510139+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:03.510415+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:04.510579+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:05.510776+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:06.510975+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:07.511113+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:08.511291+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:09.511736+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:10.511912+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:11.512061+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:12.512178+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:13.512581+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:14.512751+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:15.512958+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:16.513153+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:17.513459+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:18.513689+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:19.513992+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:20.514153+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:21.514395+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:22.514594+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562214c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:23.514984+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.494945526s of 40.619098663s, submitted: 20
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:24.515143+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120668160 unmapped: 31457280 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:25.515348+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 31842304 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560ef7860
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5631e0c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a562c565a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562214c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560218000
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:26.515585+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255c400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1358694 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:27.515809+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:28.516003+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560219e00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255cc00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560ef65a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:29.516286+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:30.516557+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5631e0c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:31.516842+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357294 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:32.517072+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:33.517255+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.344947815s of 10.433979988s, submitted: 23
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:34.517452+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a563071a40
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:35.517648+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562202c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c53400
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:36.517897+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359231 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:37.518114+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120242176 unmapped: 31883264 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:38.518330+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:39.518660+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:40.518790+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:41.519002+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:42.519123+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:43.519260+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:44.519357+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:45.519526+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:46.519693+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:47.519859+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:48.519995+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.895034790s of 14.912478447s, submitted: 5
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:49.520176+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123174912 unmapped: 28950528 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:50.520322+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:51.520607+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1462949 data_alloc: 218103808 data_used: 7610368
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:52.520779+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f895c000/0x0/0x4ffc00000, data 0x242a006/0x24f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:53.520921+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:54.521098+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:55.521231+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:56.521435+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467605 data_alloc: 218103808 data_used: 7610368
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:57.521641+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:58.521818+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8863000/0x0/0x4ffc00000, data 0x2523006/0x25e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:59.521975+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:00.522108+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.162461281s of 11.450368881s, submitted: 61
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126320640 unmapped: 25804800 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:01.522239+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1473525 data_alloc: 218103808 data_used: 7856128
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:02.522388+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:03.522554+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:04.522713+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:05.522918+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:06.523125+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1476015 data_alloc: 218103808 data_used: 7860224
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:07.523337+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:08.523513+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:09.523677+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:10.523859+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87c4000/0x0/0x4ffc00000, data 0x25c2006/0x2688000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.021368027s of 10.088058472s, submitted: 14
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202c00 session 0x55a5602214a0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c53400 session 0x55a560a9f0e0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562215c00
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:11.524010+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308597 data_alloc: 218103808 data_used: 311296
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:12.524188+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215c00 session 0x55a5630703c0
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:13.524602+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:14.525109+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:15.525415+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:16.525855+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:17.526086+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:18.526258+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:19.526482+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:20.526773+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:21.527023+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:22.527313+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:23.527796+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:24.528085+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:25.528460+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:26.528681+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:27.529017+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:28.529215+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:29.529454+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:30.529672+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:31.529989+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:32.530228+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:33.530376+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:34.530612+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:35.530749+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:36.530984+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:37.531213+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:38.531559+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:39.531783+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:40.531981+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:41.532158+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:42.532336+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:43.532523+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:44.532677+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:45.532812+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:46.533037+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:47.533213+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:48.533414+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:49.533590+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:50.533763+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:51.534243+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:52.534414+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:53.534606+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:54.534843+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:55.535065+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:56.535346+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:57.535599+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:58.535820+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:59.536056+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:00.536303+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:01.536585+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:02.536775+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:03.536977+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:04.537206+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:05.537395+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:06.537686+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:07.537942+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:08.538076+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:09.538375+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:10.538570+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:11.538798+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:12.538970+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:13.539135+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:14.539325+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:15.539664+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:16.539929+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:17.540125+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:18.540380+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:19.540586+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:20.540777+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:21.541029+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2404.3 total, 600.0 interval
                                           Cumulative writes: 13K writes, 49K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 13K writes, 4008 syncs, 3.45 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3081 writes, 10K keys, 3081 commit groups, 1.0 writes per commit group, ingest: 9.92 MB, 0.02 MB/s
                                           Interval WAL: 3081 writes, 1324 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:22.541272+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:23.541460+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:24.541668+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:25.541886+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:26.542160+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:27.542352+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:28.542574+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:29.542732+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:30.542898+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:31.543041+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:32.543234+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:33.543467+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:34.543683+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:35.543859+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:36.544123+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:37.544315+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:38.544470+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:39.544644+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:40.544758+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:41.544885+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122044416 unmapped: 30081024 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}'
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'config show' '{prefix=config show}'
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:42.545051+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:30:15 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:30:15 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122052608 unmapped: 30072832 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:43.545191+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 30474240 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:30:15 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:44.545324+0000)
Jan 23 10:30:15 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 30359552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:30:15 compute-1 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}'
Jan 23 10:30:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 10:30:15 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/2952901814' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:30:15 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4201206189' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:30:15 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1559929094' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.562 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.581s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:30:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:15.616 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.761 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.762 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4764MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.762 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.763 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.870 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.870 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:30:15 compute-1 nova_compute[225705]: 2026-01-23 10:30:15.889 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:30:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:30:15 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3107004277' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.25985 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.16494 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.26003 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3571602394' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4293863724' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: pgmap v1114: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3550185203' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.26027 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2261349115' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2952901814' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/834363629' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4201206189' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/644955657' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1559929094' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:16 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:30:16 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4004933432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:16 compute-1 nova_compute[225705]: 2026-01-23 10:30:16.402 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:30:16 compute-1 nova_compute[225705]: 2026-01-23 10:30:16.408 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:30:16 compute-1 nova_compute[225705]: 2026-01-23 10:30:16.425 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:30:16 compute-1 nova_compute[225705]: 2026-01-23 10:30:16.427 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:30:16 compute-1 nova_compute[225705]: 2026-01-23 10:30:16.427 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:30:16 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 10:30:16 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1506185174' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:16 compute-1 crontab[241215]: (root) LIST (root)
Jan 23 10:30:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.16518 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.25987 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.26048 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.16539 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.25993 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3107004277' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.26002 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2029648398' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.26066 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.16548 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2687778899' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4004933432' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/129791328' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1506185174' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4224081289' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:30:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3938838710' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:17.066 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:17 compute-1 nova_compute[225705]: 2026-01-23 10:30:17.087 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 10:30:17 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/873021322' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:17 compute-1 nova_compute[225705]: 2026-01-23 10:30:17.428 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:17 compute-1 nova_compute[225705]: 2026-01-23 10:30:17.428 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:17 compute-1 nova_compute[225705]: 2026-01-23 10:30:17.429 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:17 compute-1 nova_compute[225705]: 2026-01-23 10:30:17.429 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:17 compute-1 nova_compute[225705]: 2026-01-23 10:30:17.429 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:30:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:17.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 10:30:17 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/753443402' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.26084 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.16569 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.26029 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.26032 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/737140567' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: pgmap v1115: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3169827845' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/873021322' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/232885380' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/753443402' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/790870186' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 10:30:18 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2692355636' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 10:30:18 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2848004950' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:18 compute-1 nova_compute[225705]: 2026-01-23 10:30:18.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:30:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 10:30:18 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3799754458' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 10:30:19 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1425965098' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.26111 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.26056 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.16587 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.26126 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.26071 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.16593 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.26141 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.26089 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2692355636' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/641666540' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2848004950' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2869222915' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3337747780' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:19 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3799754458' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:19.069 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 10:30:19 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/641995280' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:19.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:19 compute-1 podman[241574]: 2026-01-23 10:30:19.709936464 +0000 UTC m=+0.107902250 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Jan 23 10:30:19 compute-1 systemd[1]: Starting Hostname Service...
Jan 23 10:30:19 compute-1 nova_compute[225705]: 2026-01-23 10:30:19.863 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 10:30:19 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1463104035' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:19 compute-1 systemd[1]: Started Hostname Service.
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.16617 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.26156 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.26098 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.16635 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.26113 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1425965098' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: pgmap v1116: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.16653 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/457950240' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/641995280' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3233910781' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2946892865' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1463104035' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3682934773' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd metadata"} v 0)
Jan 23 10:30:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/331196613' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 10:30:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1013832290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 10:30:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/758305960' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Jan 23 10:30:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/15662595' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:21.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 10:30:21 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2098994895' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:21.625 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:22 compute-1 nova_compute[225705]: 2026-01-23 10:30:22.088 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:22 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 10:30:22 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1090320711' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:23.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 10:30:23 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3613864487' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:23.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 10:30:23 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1692553818' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.16668 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.26137 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2112946453' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1755085481' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/341034865' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/331196613' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1013832290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.26158 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/256833078' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4062298466' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1621252379' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/758305960' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.26173 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1191692492' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/15662595' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: pgmap v1117: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1439010025' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 10:30:24 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3854792418' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 10:30:24 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3166484618' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:24 compute-1 nova_compute[225705]: 2026-01-23 10:30:24.866 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:25.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:25 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 10:30:25 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1855905813' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:25.630 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.26231 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1786861485' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2098994895' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2493257669' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.26243 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1879524475' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2855196845' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1569001041' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/526295899' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.16767 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1090320711' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3694179319' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: pgmap v1118: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.16779 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3613864487' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.16785 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2294747835' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1692553818' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.16797 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2531849906' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:26 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:26 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:27 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:27.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:27 compute-1 nova_compute[225705]: 2026-01-23 10:30:27.091 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:27 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3854792418' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3166484618' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.16809 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1648245877' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.16821 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2182990666' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: pgmap v1119: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.16836 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/340871671' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1855905813' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2836282561' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.16866 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.16887 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4080595786' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2610036103' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.26321 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3569729491' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:27 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:27 compute-1 ceph-mon[80126]: pgmap v1120: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:27 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 23 10:30:27 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/931928461' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:27 compute-1 sudo[242444]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:30:27 compute-1 sudo[242444]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:27 compute-1 sudo[242444]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:27.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 23 10:30:28 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.100:0/292578968' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 23 10:30:28 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3039266935' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.26357 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/896818155' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2246982503' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/931928461' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2736972453' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.26387 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.16956 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2037715614' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.26393 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/26149318' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.26399 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1004606240' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/292578968' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3039266935' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 23 10:30:28 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/584663929' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:29.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 23 10:30:29 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2920006099' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.26414 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4061140840' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/631308360' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/584663929' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3849795310' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.26426 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1961868252' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3026965908' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: pgmap v1121: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2920006099' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:29 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3687676278' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:29.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 23 10:30:29 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1049295026' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:29 compute-1 nova_compute[225705]: 2026-01-23 10:30:29.868 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:30 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat"} v 0)
Jan 23 10:30:30 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1786797257' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:31.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:31.639 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.26438 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.26335 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1997317476' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1049295026' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.26347 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3723003283' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/629032271' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.26359 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:32 compute-1 ceph-mon[80126]: from='client.26450 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:32 compute-1 nova_compute[225705]: 2026-01-23 10:30:32.093 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:32 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump"} v 0)
Jan 23 10:30:32 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1709192928' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.16995 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.26365 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.26371 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1786797257' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3321049884' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.26380 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: pgmap v1122: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3112698172' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:33 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1709192928' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:33 compute-1 ovs-appctl[243650]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 10:30:33 compute-1 ovs-appctl[243655]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 10:30:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:33.136 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:33 compute-1 ovs-appctl[243663]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Jan 23 10:30:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls"} v 0)
Jan 23 10:30:33 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/368403482' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:33.641 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:34 compute-1 ceph-mon[80126]: from='client.26389 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:34 compute-1 ceph-mon[80126]: from='client.26395 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:34 compute-1 ceph-mon[80126]: from='client.26477 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:34 compute-1 ceph-mon[80126]: pgmap v1123: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:34 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3510438922' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:34 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3880234350' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:30:34 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/368403482' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:34 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3916140662' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:30:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump"} v 0)
Jan 23 10:30:34 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2586594643' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:34 compute-1 nova_compute[225705]: 2026-01-23 10:30:34.907 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:35 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status"} v 0)
Jan 23 10:30:35 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2023447290' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:35.139 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.26404 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.17031 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.26416 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.26489 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.17040 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2654627831' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/965967208' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2586594643' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1237615592' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3359416428' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2023447290' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:35 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:35.644 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:35 compute-1 podman[244677]: 2026-01-23 10:30:35.667012025 +0000 UTC m=+0.068533442 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Jan 23 10:30:36 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:36 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail"} v 0)
Jan 23 10:30:36 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/514541159' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='client.26498 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: pgmap v1124: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='client.17085 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='client.26525 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='client.17094 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='client.26534 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/112016515' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/514541159' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:30:36 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:30:36 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat"} v 0)
Jan 23 10:30:36 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2493126188' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:37.142 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:37 compute-1 nova_compute[225705]: 2026-01-23 10:30:37.150 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:37.646 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:37 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2493126188' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:37 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3084282709' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:37 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/25636672' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:30:37 compute-1 ceph-mon[80126]: from='client.26567 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:37 compute-1 ceph-mon[80126]: from='client.17130 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:37 compute-1 ceph-mon[80126]: from='client.26573 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:37 compute-1 ceph-mon[80126]: pgmap v1125: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:37 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 10:30:37 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1176070681' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status"} v 0)
Jan 23 10:30:38 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/571241287' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Jan 23 10:30:38 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2918126809' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.26579 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.17139 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1362458980' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1176070681' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2010755244' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/419775306' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/571241287' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2274561256' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2918126809' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/372859611' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3820475493' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:39.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 10:30:39 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1796333743' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:39.647 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:39 compute-1 nova_compute[225705]: 2026-01-23 10:30:39.910 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json-pretty"} v 0)
Jan 23 10:30:40 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1655420808' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-1 ceph-mon[80126]: from='client.26606 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1920671338' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:30:40 compute-1 ceph-mon[80126]: from='client.17172 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-1 ceph-mon[80126]: pgmap v1126: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:40 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1796333743' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/49428748' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:40 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump", "format": "json-pretty"} v 0)
Jan 23 10:30:40 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2704307379' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls", "format": "json-pretty"} v 0)
Jan 23 10:30:41 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2288402487' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:41.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.26590 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4128500998' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/50610532' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1655420808' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2904729046' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2704307379' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1965759482' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2550268996' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2288402487' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: pgmap v1127: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.26623 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:41 compute-1 ceph-mon[80126]: from='client.17208 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:41.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:41 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mds stat", "format": "json-pretty"} v 0)
Jan 23 10:30:41 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/560283955' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:42 compute-1 nova_compute[225705]: 2026-01-23 10:30:42.151 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:42 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon dump", "format": "json-pretty"} v 0)
Jan 23 10:30:42 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3879520383' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-1 ceph-mon[80126]: from='client.26642 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/22422002' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:30:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2641249001' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/560283955' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-1 ceph-mon[80126]: from='client.26641 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/114055830' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:42 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3879520383' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:43.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd blocklist ls", "format": "json-pretty"} v 0)
Jan 23 10:30:43 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3587918853' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:43.651 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:43 compute-1 ceph-mon[80126]: from='client.26650 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:43 compute-1 ceph-mon[80126]: from='client.17229 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:43 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1755609551' entity='client.admin' cmd=[{"prefix": "osd dump"}]: dispatch
Jan 23 10:30:43 compute-1 ceph-mon[80126]: from='client.26663 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:43 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/946102600' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:43 compute-1 ceph-mon[80126]: pgmap v1128: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:43 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3090469002' entity='client.admin' cmd=[{"prefix": "osd numa-status"}]: dispatch
Jan 23 10:30:43 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3587918853' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:44 compute-1 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 10:30:44 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd dump", "format": "json-pretty"} v 0)
Jan 23 10:30:44 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1756811035' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:44 compute-1 nova_compute[225705]: 2026-01-23 10:30:44.959 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:45 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd numa-status", "format": "json-pretty"} v 0)
Jan 23 10:30:45 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/634711209' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.26674 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.17244 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.26675 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.26683 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.17250 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.26681 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3100141440' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4118000991' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1756811035' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2020289235' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:45 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3681758551' entity='client.admin' cmd=[{"prefix": "osd stat"}]: dispatch
Jan 23 10:30:45 compute-1 systemd[1]: Starting Time & Date Service...
Jan 23 10:30:45 compute-1 sudo[245890]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:30:45 compute-1 sudo[245890]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:45 compute-1 sudo[245890]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:45 compute-1 sudo[245926]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:30:45 compute-1 sudo[245926]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:45.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:45 compute-1 systemd[1]: Started Time & Date Service.
Jan 23 10:30:45 compute-1 sudo[245926]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:45.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:45 compute-1 sudo[246036]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:30:45 compute-1 sudo[246036]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:45 compute-1 sudo[246036]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:45 compute-1 sudo[246085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:30:45 compute-1 sudo[246085]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:46 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/634711209' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:46 compute-1 ceph-mon[80126]: pgmap v1129: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:30:46 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:46 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:47 compute-1 sudo[246085]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:47 compute-1 nova_compute[225705]: 2026-01-23 10:30:47.153 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:47.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 10:30:47 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1725430918' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:30:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Cumulative writes: 6530 writes, 35K keys, 6530 commit groups, 1.0 writes per commit group, ingest: 0.09 GB, 0.04 MB/s
                                           Cumulative WAL: 6530 writes, 6530 syncs, 1.00 writes per sync, written: 0.09 GB, 0.04 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1542 writes, 7888 keys, 1542 commit groups, 1.0 writes per commit group, ingest: 17.81 MB, 0.03 MB/s
                                           Interval WAL: 1542 writes, 1542 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0     61.0      0.80              0.15        18    0.045       0      0       0.0       0.0
                                             L6      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   4.4    118.4    101.7      2.09              0.60        17    0.123     94K   9317       0.0       0.0
                                            Sum      1/0   13.93 MB   0.0      0.2     0.0      0.2       0.3      0.1       0.0   5.4     85.6     90.4      2.90              0.75        35    0.083     94K   9317       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.0       0.1      0.0       0.0   5.6     96.0     98.7      0.64              0.18         8    0.081     26K   2540       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.2       0.2      0.0       0.0   0.0    118.4    101.7      2.09              0.60        17    0.123     94K   9317       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0     61.1      0.80              0.15        17    0.047       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 2400.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.048, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.26 GB write, 0.11 MB/s write, 0.24 GB read, 0.10 MB/s read, 2.9 seconds
                                           Interval compaction: 0.06 GB write, 0.11 MB/s write, 0.06 GB read, 0.10 MB/s read, 0.6 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 22.75 MB table_size: 0 occupancy: 18446744073709551615 collections: 5 last_copies: 0 last_secs: 0.000178 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1398,22.02 MB,7.24434%) FilterBlock(35,274.42 KB,0.0881546%) IndexBlock(35,473.86 KB,0.152221%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:30:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump", "format": "json-pretty"} v 0)
Jan 23 10:30:47 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.102:0/3995885793' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 sudo[246262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:30:47 compute-1 sudo[246262]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:47 compute-1 sudo[246262]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:47.654 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd stat", "format": "json-pretty"} v 0)
Jan 23 10:30:47 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1993038481' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.17274 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.26710 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.26705 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.17280 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.26714 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.26716 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2973156896' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1942062714' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1869920413' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.17301 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3530506813' entity='client.admin' cmd=[{"prefix": "time-sync-status"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: pgmap v1130: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1725430918' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3995885793' entity='client.admin' cmd=[{"prefix": "config dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1993038481' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1944953467' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status", "format": "json-pretty"} v 0)
Jan 23 10:30:48 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3762032723' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:49.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:49 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "time-sync-status", "format": "json-pretty"} v 0)
Jan 23 10:30:49 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/736282130' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:49.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.17307 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.26740 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.26741 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/955048904' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2304269384' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2448494540' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2448494540' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3049937054' entity='client.admin' cmd=[{"prefix": "df", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3762032723' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:49 compute-1 nova_compute[225705]: 2026-01-23 10:30:49.962 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:50 compute-1 podman[246382]: 2026-01-23 10:30:50.712785068 +0000 UTC m=+0.109211701 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:30:50 compute-1 ceph-mon[80126]: from='client.26747 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-1 ceph-mon[80126]: pgmap v1131: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Jan 23 10:30:50 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/966591196' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/736282130' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:30:50 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3027541159' entity='client.admin' cmd=[{"prefix": "fs ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:51.160 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:51.658 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:51 compute-1 ceph-mon[80126]: from='client.26785 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-1 ceph-mon[80126]: pgmap v1132: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Jan 23 10:30:51 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2631473347' entity='client.admin' cmd=[{"prefix": "mds stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:51 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3298491043' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:52 compute-1 nova_compute[225705]: 2026-01-23 10:30:52.158 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:53 compute-1 ceph-mon[80126]: from='client.26803 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:53 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/814440401' entity='client.admin' cmd=[{"prefix": "osd blocklist ls", "format": "json-pretty"}]: dispatch
Jan 23 10:30:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:53.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:53 compute-1 sudo[246413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:30:53 compute-1 sudo[246413]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:30:53 compute-1 sudo[246413]: pam_unix(sudo:session): session closed for user root
Jan 23 10:30:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:53.661 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:54 compute-1 ceph-mon[80126]: from='client.26815 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:54 compute-1 ceph-mon[80126]: pgmap v1133: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 512 B/s rd, 0 op/s
Jan 23 10:30:54 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:30:54 compute-1 ceph-mon[80126]: from='client.26821 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1998309136' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2533419223' entity='client.admin' cmd=[{"prefix": "osd numa-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:54 compute-1 nova_compute[225705]: 2026-01-23 10:30:54.965 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:30:55.060 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:30:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:30:55.061 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:30:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:30:55.062 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:30:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:55.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:55.663 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:56 compute-1 ceph-mon[80126]: from='client.26839 -' entity='client.admin' cmd=[{"prefix": "osd perf", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:56 compute-1 ceph-mon[80126]: pgmap v1134: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 513 B/s rd, 0 op/s
Jan 23 10:30:56 compute-1 ceph-mon[80126]: from='client.26845 -' entity='client.admin' cmd=[{"prefix": "osd pool autoscale-status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:30:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:30:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:30:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:30:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:30:57 compute-1 nova_compute[225705]: 2026-01-23 10:30:57.161 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:30:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:57.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:30:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:57.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:30:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1831285676' entity='client.admin' cmd=[{"prefix": "osd pool ls", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:30:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2165199586' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json-pretty"}]: dispatch
Jan 23 10:30:57 compute-1 ceph-mon[80126]: pgmap v1135: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 769 B/s rd, 0 op/s
Jan 23 10:30:57 compute-1 ceph-mon[80126]: from='client.26863 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:30:59 compute-1 ceph-mon[80126]: from='client.26869 -' entity='client.admin' cmd=[{"prefix": "pg stat", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:30:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4212707187' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4094141977' entity='client.admin' cmd=[{"prefix": "time-sync-status", "format": "json-pretty"}]: dispatch
Jan 23 10:30:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:30:59.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:30:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:30:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:30:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:30:59.668 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:00 compute-1 nova_compute[225705]: 2026-01-23 10:31:00.023 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:00 compute-1 ceph-mon[80126]: pgmap v1136: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:01.179 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:01.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:02 compute-1 nova_compute[225705]: 2026-01-23 10:31:02.163 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:02 compute-1 ceph-mon[80126]: pgmap v1137: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:03.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:03 compute-1 ceph-mon[80126]: pgmap v1138: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:03.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:05 compute-1 nova_compute[225705]: 2026-01-23 10:31:05.027 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:05.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:05.675 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:06 compute-1 ceph-mon[80126]: pgmap v1139: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:06 compute-1 podman[246446]: 2026-01-23 10:31:06.663504745 +0000 UTC m=+0.066625095 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:31:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:07 compute-1 nova_compute[225705]: 2026-01-23 10:31:07.162 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:07.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:07.677 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:07 compute-1 sudo[246466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:31:07 compute-1 sudo[246466]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:07 compute-1 sudo[246466]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:08 compute-1 ceph-mon[80126]: pgmap v1140: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:09.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:09.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:09 compute-1 ceph-mon[80126]: pgmap v1141: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:10 compute-1 nova_compute[225705]: 2026-01-23 10:31:10.031 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:11.196 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:11.680 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:11 compute-1 nova_compute[225705]: 2026-01-23 10:31:11.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:11 compute-1 nova_compute[225705]: 2026-01-23 10:31:11.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:31:11 compute-1 nova_compute[225705]: 2026-01-23 10:31:11.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:31:11 compute-1 nova_compute[225705]: 2026-01-23 10:31:11.892 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:31:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:12 compute-1 nova_compute[225705]: 2026-01-23 10:31:12.164 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:12 compute-1 ceph-mon[80126]: pgmap v1142: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:12 compute-1 nova_compute[225705]: 2026-01-23 10:31:12.886 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:13.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:13 compute-1 ceph-mon[80126]: pgmap v1143: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:13.683 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:13 compute-1 nova_compute[225705]: 2026-01-23 10:31:13.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2905912035' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:14 compute-1 nova_compute[225705]: 2026-01-23 10:31:14.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:14 compute-1 nova_compute[225705]: 2026-01-23 10:31:14.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:31:14 compute-1 nova_compute[225705]: 2026-01-23 10:31:14.913 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:31:14 compute-1 nova_compute[225705]: 2026-01-23 10:31:14.914 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:14 compute-1 nova_compute[225705]: 2026-01-23 10:31:14.915 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:31:15 compute-1 nova_compute[225705]: 2026-01-23 10:31:15.071 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:15.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:15 compute-1 systemd[1]: systemd-timedated.service: Deactivated successfully.
Jan 23 10:31:15 compute-1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Jan 23 10:31:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:15.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:15 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/147924141' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:15 compute-1 ceph-mon[80126]: pgmap v1144: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:15 compute-1 nova_compute[225705]: 2026-01-23 10:31:15.930 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:15 compute-1 nova_compute[225705]: 2026-01-23 10:31:15.957 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:31:15 compute-1 nova_compute[225705]: 2026-01-23 10:31:15.958 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:31:15 compute-1 nova_compute[225705]: 2026-01-23 10:31:15.958 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:31:15 compute-1 nova_compute[225705]: 2026-01-23 10:31:15.958 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:31:15 compute-1 nova_compute[225705]: 2026-01-23 10:31:15.959 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:31:16 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:31:16 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4255286849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.419 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.460s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.611 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.612 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4651MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.612 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.613 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.692 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.692 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:31:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4255286849' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:16 compute-1 nova_compute[225705]: 2026-01-23 10:31:16.819 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:31:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.167 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:17.206 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:31:17 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2961280680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.307 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.315 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.337 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.342 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.343 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.730s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:31:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:17.687 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:17 compute-1 nova_compute[225705]: 2026-01-23 10:31:17.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:17 compute-1 ceph-mon[80126]: pgmap v1145: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2961280680' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/177182351' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:18 compute-1 nova_compute[225705]: 2026-01-23 10:31:18.882 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:18 compute-1 nova_compute[225705]: 2026-01-23 10:31:18.907 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:18 compute-1 nova_compute[225705]: 2026-01-23 10:31:18.908 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:31:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/339495991' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:31:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:19.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:19.690 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:20 compute-1 nova_compute[225705]: 2026-01-23 10:31:20.074 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:20 compute-1 ceph-mon[80126]: pgmap v1146: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:20 compute-1 podman[246545]: 2026-01-23 10:31:20.864859389 +0000 UTC m=+0.099222794 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:31:20 compute-1 nova_compute[225705]: 2026-01-23 10:31:20.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:21.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:21 compute-1 ceph-mon[80126]: pgmap v1147: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:21.692 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:21 compute-1 nova_compute[225705]: 2026-01-23 10:31:21.775 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:31:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:22 compute-1 nova_compute[225705]: 2026-01-23 10:31:22.172 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:23.217 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:23.694 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:24 compute-1 ceph-mon[80126]: pgmap v1148: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:25 compute-1 nova_compute[225705]: 2026-01-23 10:31:25.077 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:25.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:25.697 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:26 compute-1 ceph-mon[80126]: pgmap v1149: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:27 compute-1 nova_compute[225705]: 2026-01-23 10:31:27.173 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:27.223 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:27 compute-1 ceph-mon[80126]: pgmap v1150: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:27.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:27 compute-1 sudo[246573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:31:27 compute-1 sudo[246573]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:27 compute-1 sudo[246573]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:29.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:29 compute-1 ceph-mon[80126]: pgmap v1151: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:29.702 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:30 compute-1 nova_compute[225705]: 2026-01-23 10:31:30.081 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:31.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:31.705 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:31 compute-1 ceph-mon[80126]: pgmap v1152: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:32 compute-1 nova_compute[225705]: 2026-01-23 10:31:32.175 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:33 compute-1 sshd-session[246600]: Invalid user sol from 45.148.10.240 port 59382
Jan 23 10:31:33 compute-1 sshd-session[246600]: Connection closed by invalid user sol 45.148.10.240 port 59382 [preauth]
Jan 23 10:31:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:33.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:33.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:34 compute-1 ceph-mon[80126]: pgmap v1153: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:35 compute-1 nova_compute[225705]: 2026-01-23 10:31:35.111 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:35.237 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:35.709 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:36 compute-1 ceph-mon[80126]: pgmap v1154: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:37 compute-1 nova_compute[225705]: 2026-01-23 10:31:37.176 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:37.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:37 compute-1 ceph-mon[80126]: pgmap v1155: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:37 compute-1 podman[246605]: 2026-01-23 10:31:37.68659216 +0000 UTC m=+0.085809713 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Jan 23 10:31:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:37.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:39.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:39.714 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:39 compute-1 sudo[239143]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:39 compute-1 sshd-session[239142]: Received disconnect from 192.168.122.10 port 56786:11: disconnected by user
Jan 23 10:31:39 compute-1 sshd-session[239142]: Disconnected from user zuul 192.168.122.10 port 56786
Jan 23 10:31:39 compute-1 sshd-session[239139]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:31:39 compute-1 systemd[1]: session-55.scope: Deactivated successfully.
Jan 23 10:31:39 compute-1 systemd[1]: session-55.scope: Consumed 2min 58.763s CPU time, 719.0M memory peak, read 257.5M from disk, written 64.5M to disk.
Jan 23 10:31:39 compute-1 systemd-logind[807]: Session 55 logged out. Waiting for processes to exit.
Jan 23 10:31:39 compute-1 systemd-logind[807]: Removed session 55.
Jan 23 10:31:40 compute-1 sshd-session[246625]: Accepted publickey for zuul from 192.168.122.10 port 56820 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:31:40 compute-1 systemd-logind[807]: New session 56 of user zuul.
Jan 23 10:31:40 compute-1 systemd[1]: Started Session 56 of User zuul.
Jan 23 10:31:40 compute-1 sshd-session[246625]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:31:40 compute-1 nova_compute[225705]: 2026-01-23 10:31:40.114 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:40 compute-1 sudo[246629]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/cat /var/tmp/sos-osp/sosreport-compute-1-2026-01-23-cggcnkh.tar.xz
Jan 23 10:31:40 compute-1 sudo[246629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:31:40 compute-1 sudo[246629]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:40 compute-1 sshd-session[246628]: Received disconnect from 192.168.122.10 port 56820:11: disconnected by user
Jan 23 10:31:40 compute-1 sshd-session[246628]: Disconnected from user zuul 192.168.122.10 port 56820
Jan 23 10:31:40 compute-1 sshd-session[246625]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:31:40 compute-1 systemd[1]: session-56.scope: Deactivated successfully.
Jan 23 10:31:40 compute-1 systemd-logind[807]: Session 56 logged out. Waiting for processes to exit.
Jan 23 10:31:40 compute-1 systemd-logind[807]: Removed session 56.
Jan 23 10:31:40 compute-1 ceph-mon[80126]: pgmap v1156: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:40 compute-1 sshd-session[246654]: Accepted publickey for zuul from 192.168.122.10 port 56826 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:31:40 compute-1 systemd-logind[807]: New session 57 of user zuul.
Jan 23 10:31:40 compute-1 systemd[1]: Started Session 57 of User zuul.
Jan 23 10:31:40 compute-1 sshd-session[246654]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:31:40 compute-1 sudo[246658]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rm -rf /var/tmp/sos-osp
Jan 23 10:31:40 compute-1 sudo[246658]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:31:40 compute-1 sudo[246658]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:40 compute-1 sshd-session[246657]: Received disconnect from 192.168.122.10 port 56826:11: disconnected by user
Jan 23 10:31:40 compute-1 sshd-session[246657]: Disconnected from user zuul 192.168.122.10 port 56826
Jan 23 10:31:40 compute-1 sshd-session[246654]: pam_unix(sshd:session): session closed for user zuul
Jan 23 10:31:40 compute-1 systemd[1]: session-57.scope: Deactivated successfully.
Jan 23 10:31:40 compute-1 systemd-logind[807]: Session 57 logged out. Waiting for processes to exit.
Jan 23 10:31:40 compute-1 systemd-logind[807]: Removed session 57.
Jan 23 10:31:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:41.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:41 compute-1 ceph-mon[80126]: pgmap v1157: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:41.715 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:42 compute-1 nova_compute[225705]: 2026-01-23 10:31:42.178 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:43.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:43.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:44 compute-1 ceph-mon[80126]: pgmap v1158: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:45 compute-1 nova_compute[225705]: 2026-01-23 10:31:45.118 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:45.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:45 compute-1 ceph-mon[80126]: pgmap v1159: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:31:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:45.718 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:31:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:47 compute-1 nova_compute[225705]: 2026-01-23 10:31:47.178 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:47.257 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:47.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:47 compute-1 sudo[246687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:31:47 compute-1 sudo[246687]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:47 compute-1 sudo[246687]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:48 compute-1 ceph-mon[80126]: pgmap v1160: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1552987279' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:31:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1552987279' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:31:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:49.261 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:49.723 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:50 compute-1 nova_compute[225705]: 2026-01-23 10:31:50.122 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:50 compute-1 ceph-mon[80126]: pgmap v1161: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:31:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:51.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:51 compute-1 ceph-mon[80126]: pgmap v1162: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:31:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:51.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:51 compute-1 podman[246715]: 2026-01-23 10:31:51.741918802 +0000 UTC m=+0.135660127 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Jan 23 10:31:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:52 compute-1 nova_compute[225705]: 2026-01-23 10:31:52.180 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:31:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:53.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:31:53 compute-1 sudo[246742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:31:53 compute-1 sudo[246742]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:53 compute-1 sudo[246742]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:53 compute-1 sudo[246768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:31:53 compute-1 sudo[246768]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:53.727 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:54 compute-1 sudo[246768]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:54 compute-1 ceph-mon[80126]: pgmap v1163: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:31:54 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:54 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:31:55.063 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:31:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:31:55.063 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:31:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:31:55.064 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:31:55 compute-1 nova_compute[225705]: 2026-01-23 10:31:55.125 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:31:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:31:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:31:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:31:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:31:55 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:31:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:55.273 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:55.729 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:56 compute-1 ceph-mon[80126]: pgmap v1164: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:31:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:31:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:31:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:31:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:31:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:31:57 compute-1 nova_compute[225705]: 2026-01-23 10:31:57.181 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:31:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:31:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:57.275 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:31:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:57.732 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:58 compute-1 ceph-mon[80126]: pgmap v1165: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 801 B/s rd, 0 op/s
Jan 23 10:31:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:31:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:31:59.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:31:59 compute-1 ceph-mon[80126]: pgmap v1166: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:31:59 compute-1 sudo[246827]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:31:59 compute-1 sudo[246827]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:31:59 compute-1 sudo[246827]: pam_unix(sudo:session): session closed for user root
Jan 23 10:31:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:31:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:31:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:31:59.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:00 compute-1 nova_compute[225705]: 2026-01-23 10:32:00.129 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:00 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:32:00 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:32:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:01.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:01 compute-1 ceph-mon[80126]: pgmap v1167: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:32:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:01.736 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:02 compute-1 nova_compute[225705]: 2026-01-23 10:32:02.185 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #64. Immutable memtables: 0.
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.341299) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 37] Flushing memtable with next log file: 64
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322341426, "job": 37, "event": "flush_started", "num_memtables": 1, "num_entries": 2851, "num_deletes": 506, "total_data_size": 6398009, "memory_usage": 6490344, "flush_reason": "Manual Compaction"}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 37] Level-0 flush table #65: started
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322360132, "cf_name": "default", "job": 37, "event": "table_file_creation", "file_number": 65, "file_size": 2686587, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 33518, "largest_seqno": 36364, "table_properties": {"data_size": 2676761, "index_size": 5040, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3717, "raw_key_size": 31915, "raw_average_key_size": 21, "raw_value_size": 2652031, "raw_average_value_size": 1807, "num_data_blocks": 215, "num_entries": 1467, "num_filter_entries": 1467, "num_deletions": 506, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164133, "oldest_key_time": 1769164133, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 65, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 37] Flush lasted 18892 microseconds, and 7774 cpu microseconds.
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.360207) [db/flush_job.cc:967] [default] [JOB 37] Level-0 flush table #65: 2686587 bytes OK
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.360268) [db/memtable_list.cc:519] [default] Level-0 commit table #65 started
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.363096) [db/memtable_list.cc:722] [default] Level-0 commit table #65: memtable #1 done
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.363130) EVENT_LOG_v1 {"time_micros": 1769164322363126, "job": 37, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.363150) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 37] Try to delete WAL files size 6383594, prev total WAL file size 6383594, number of live WAL files 2.
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000061.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.365049) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031303034' seq:72057594037927935, type:22 .. '6D6772737461740031323536' seq:0, type:0; will stop at (end)
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 38] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 37 Base level 0, inputs: [65(2623KB)], [63(13MB)]
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322365244, "job": 38, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [65], "files_L6": [63], "score": -1, "input_data_size": 17294163, "oldest_snapshot_seqno": -1}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 38] Generated table #66: 6859 keys, 14416841 bytes, temperature: kUnknown
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322499623, "cf_name": "default", "job": 38, "event": "table_file_creation", "file_number": 66, "file_size": 14416841, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14372547, "index_size": 26070, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 177005, "raw_average_key_size": 25, "raw_value_size": 14250687, "raw_average_value_size": 2077, "num_data_blocks": 1044, "num_entries": 6859, "num_filter_entries": 6859, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164322, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 66, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.500211) [db/compaction/compaction_job.cc:1663] [default] [JOB 38] Compacted 1@0 + 1@6 files to L6 => 14416841 bytes
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.503100) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 128.6 rd, 107.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.6, 13.9 +0.0 blob) out(13.7 +0.0 blob), read-write-amplify(11.8) write-amplify(5.4) OK, records in: 7803, records dropped: 944 output_compression: NoCompression
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.503132) EVENT_LOG_v1 {"time_micros": 1769164322503118, "job": 38, "event": "compaction_finished", "compaction_time_micros": 134514, "compaction_time_cpu_micros": 59576, "output_level": 6, "num_output_files": 1, "total_output_size": 14416841, "num_input_records": 7803, "num_output_records": 6859, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000065.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322504332, "job": 38, "event": "table_file_deletion", "file_number": 65}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000063.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164322509479, "job": 38, "event": "table_file_deletion", "file_number": 63}
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.364801) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509610) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509623) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509626) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:02 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:02.509631) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:03.287 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:03 compute-1 ceph-mon[80126]: pgmap v1168: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:32:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:03.738 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:05 compute-1 nova_compute[225705]: 2026-01-23 10:32:05.173 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:05.292 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:05.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:05 compute-1 ceph-mon[80126]: pgmap v1169: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 534 B/s rd, 0 op/s
Jan 23 10:32:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:07 compute-1 nova_compute[225705]: 2026-01-23 10:32:07.187 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:07.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:07.742 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:07 compute-1 ceph-mon[80126]: pgmap v1170: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:07 compute-1 sudo[246857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:32:07 compute-1 sudo[246857]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:08 compute-1 sudo[246857]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:08 compute-1 podman[246881]: 2026-01-23 10:32:08.081375111 +0000 UTC m=+0.074633937 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:32:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:09.299 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:09.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:09 compute-1 ceph-mon[80126]: pgmap v1171: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:10 compute-1 nova_compute[225705]: 2026-01-23 10:32:10.177 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:11.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:32:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:11.746 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:32:11 compute-1 ceph-mon[80126]: pgmap v1172: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:12 compute-1 nova_compute[225705]: 2026-01-23 10:32:12.045 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:12 compute-1 nova_compute[225705]: 2026-01-23 10:32:12.045 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:32:12 compute-1 nova_compute[225705]: 2026-01-23 10:32:12.045 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:32:12 compute-1 nova_compute[225705]: 2026-01-23 10:32:12.098 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:32:12 compute-1 nova_compute[225705]: 2026-01-23 10:32:12.188 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:13.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:13.747 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:13 compute-1 ceph-mon[80126]: pgmap v1173: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:13 compute-1 nova_compute[225705]: 2026-01-23 10:32:13.923 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:15 compute-1 nova_compute[225705]: 2026-01-23 10:32:15.195 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:15.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:15.749 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:15 compute-1 ceph-mon[80126]: pgmap v1174: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:15 compute-1 nova_compute[225705]: 2026-01-23 10:32:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3115399558' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.902 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:32:16 compute-1 nova_compute[225705]: 2026-01-23 10:32:16.902 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:32:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:17 compute-1 nova_compute[225705]: 2026-01-23 10:32:17.191 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:17.313 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:17 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:32:17 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1685129570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-1 nova_compute[225705]: 2026-01-23 10:32:17.434 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.531s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:32:17 compute-1 nova_compute[225705]: 2026-01-23 10:32:17.604 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:32:17 compute-1 nova_compute[225705]: 2026-01-23 10:32:17.607 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4837MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:32:17 compute-1 nova_compute[225705]: 2026-01-23 10:32:17.607 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:32:17 compute-1 nova_compute[225705]: 2026-01-23 10:32:17.608 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:32:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:17.751 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:17 compute-1 ceph-mon[80126]: pgmap v1175: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3707941147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1685129570' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3359406760' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #67. Immutable memtables: 0.
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.887292) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 39] Flushing memtable with next log file: 67
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337887332, "job": 39, "event": "flush_started", "num_memtables": 1, "num_entries": 400, "num_deletes": 251, "total_data_size": 456611, "memory_usage": 464888, "flush_reason": "Manual Compaction"}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 39] Level-0 flush table #68: started
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337891573, "cf_name": "default", "job": 39, "event": "table_file_creation", "file_number": 68, "file_size": 298037, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36369, "largest_seqno": 36764, "table_properties": {"data_size": 295738, "index_size": 463, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 773, "raw_key_size": 5740, "raw_average_key_size": 18, "raw_value_size": 291160, "raw_average_value_size": 948, "num_data_blocks": 20, "num_entries": 307, "num_filter_entries": 307, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164323, "oldest_key_time": 1769164323, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 68, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 39] Flush lasted 4328 microseconds, and 1505 cpu microseconds.
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891622) [db/flush_job.cc:967] [default] [JOB 39] Level-0 flush table #68: 298037 bytes OK
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.891644) [db/memtable_list.cc:519] [default] Level-0 commit table #68 started
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893595) [db/memtable_list.cc:722] [default] Level-0 commit table #68: memtable #1 done
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893612) EVENT_LOG_v1 {"time_micros": 1769164337893607, "job": 39, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893634) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 39] Try to delete WAL files size 454028, prev total WAL file size 454028, number of live WAL files 2.
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000064.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.894054) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730032373631' seq:72057594037927935, type:22 .. '7061786F730033303133' seq:0, type:0; will stop at (end)
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 40] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 39 Base level 0, inputs: [68(291KB)], [66(13MB)]
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337894086, "job": 40, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [68], "files_L6": [66], "score": -1, "input_data_size": 14714878, "oldest_snapshot_seqno": -1}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 40] Generated table #69: 6656 keys, 12554193 bytes, temperature: kUnknown
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337969059, "cf_name": "default", "job": 40, "event": "table_file_creation", "file_number": 69, "file_size": 12554193, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12512675, "index_size": 23806, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16645, "raw_key_size": 173491, "raw_average_key_size": 26, "raw_value_size": 12395764, "raw_average_value_size": 1862, "num_data_blocks": 943, "num_entries": 6656, "num_filter_entries": 6656, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164337, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 69, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.969457) [db/compaction/compaction_job.cc:1663] [default] [JOB 40] Compacted 1@0 + 1@6 files to L6 => 12554193 bytes
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.971213) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 195.9 rd, 167.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.3, 13.7 +0.0 blob) out(12.0 +0.0 blob), read-write-amplify(91.5) write-amplify(42.1) OK, records in: 7166, records dropped: 510 output_compression: NoCompression
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.971233) EVENT_LOG_v1 {"time_micros": 1769164337971223, "job": 40, "event": "compaction_finished", "compaction_time_micros": 75126, "compaction_time_cpu_micros": 28493, "output_level": 6, "num_output_files": 1, "total_output_size": 12554193, "num_input_records": 7166, "num_output_records": 6656, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000068.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337972165, "job": 40, "event": "table_file_deletion", "file_number": 68}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000066.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164337974850, "job": 40, "event": "table_file_deletion", "file_number": 66}
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.893992) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974959) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974961) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974963) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:17 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:32:17.974964) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.109 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.110 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.241 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:32:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:32:18 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/204262838' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.721 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.727 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.763 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.765 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:32:18 compute-1 nova_compute[225705]: 2026-01-23 10:32:18.765 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:32:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4119112767' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/204262838' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:32:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:19.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:19.753 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:19 compute-1 nova_compute[225705]: 2026-01-23 10:32:19.766 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:19 compute-1 nova_compute[225705]: 2026-01-23 10:32:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:19 compute-1 nova_compute[225705]: 2026-01-23 10:32:19.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:32:20 compute-1 ceph-mon[80126]: pgmap v1176: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:20 compute-1 nova_compute[225705]: 2026-01-23 10:32:20.235 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:20 compute-1 nova_compute[225705]: 2026-01-23 10:32:20.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:32:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:21.320 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:21.755 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:22 compute-1 nova_compute[225705]: 2026-01-23 10:32:22.194 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:22 compute-1 ceph-mon[80126]: pgmap v1177: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:22 compute-1 podman[246954]: 2026-01-23 10:32:22.70365427 +0000 UTC m=+0.104362652 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 10:32:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:23.322 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:23.756 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:24 compute-1 ceph-mon[80126]: pgmap v1178: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:25 compute-1 nova_compute[225705]: 2026-01-23 10:32:25.238 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:25.326 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:25 compute-1 ceph-mon[80126]: pgmap v1179: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:25.759 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:27 compute-1 nova_compute[225705]: 2026-01-23 10:32:27.196 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:32:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:27.329 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:32:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:27.760 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:27 compute-1 ceph-mon[80126]: pgmap v1180: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:28 compute-1 sudo[246983]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:32:28 compute-1 sudo[246983]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:28 compute-1 sudo[246983]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:29.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:29.764 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:30 compute-1 nova_compute[225705]: 2026-01-23 10:32:30.242 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:31 compute-1 ceph-mon[80126]: pgmap v1181: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:31.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:32:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:31.766 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:32:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:32 compute-1 nova_compute[225705]: 2026-01-23 10:32:32.199 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:32 compute-1 ceph-mon[80126]: pgmap v1182: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:33.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:33.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:34 compute-1 ceph-mon[80126]: pgmap v1183: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:32:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:35 compute-1 nova_compute[225705]: 2026-01-23 10:32:35.246 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:35.344 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:35 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000001 to be held by another RGW process; skipping for now
Jan 23 10:32:35 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000003 to be held by another RGW process; skipping for now
Jan 23 10:32:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:35.769 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:36 compute-1 radosgw[83743]: INFO: RGWReshardLock::lock found lock on reshard.0000000011 to be held by another RGW process; skipping for now
Jan 23 10:32:36 compute-1 ceph-mon[80126]: pgmap v1184: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:32:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:37 compute-1 nova_compute[225705]: 2026-01-23 10:32:37.201 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:37.347 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:37.771 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:37 compute-1 ceph-mon[80126]: pgmap v1185: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 12 KiB/s rd, 0 B/s wr, 19 op/s
Jan 23 10:32:38 compute-1 podman[247013]: 2026-01-23 10:32:38.702225413 +0000 UTC m=+0.090974152 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202)
Jan 23 10:32:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:39.350 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:39.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:40 compute-1 nova_compute[225705]: 2026-01-23 10:32:40.276 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:40 compute-1 ceph-mon[80126]: pgmap v1186: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 0 B/s wr, 18 op/s
Jan 23 10:32:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:32:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:41.354 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:32:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:41.775 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:42 compute-1 nova_compute[225705]: 2026-01-23 10:32:42.203 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:42 compute-1 ceph-mon[80126]: pgmap v1187: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 11 KiB/s rd, 0 B/s wr, 18 op/s
Jan 23 10:32:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:43.358 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:43.778 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:44 compute-1 ceph-mon[80126]: pgmap v1188: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 68 op/s
Jan 23 10:32:45 compute-1 nova_compute[225705]: 2026-01-23 10:32:45.280 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:45.360 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:45.780 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:45 compute-1 ceph-mon[80126]: pgmap v1189: 353 pgs: 353 active+clean; 41 MiB data, 251 MiB used, 60 GiB / 60 GiB avail; 41 KiB/s rd, 0 B/s wr, 68 op/s
Jan 23 10:32:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:47 compute-1 nova_compute[225705]: 2026-01-23 10:32:47.206 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:47.364 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:47.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:48 compute-1 sudo[247037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:32:48 compute-1 sudo[247037]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:48 compute-1 sudo[247037]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:48 compute-1 ceph-mon[80126]: pgmap v1190: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 89 KiB/s rd, 0 B/s wr, 147 op/s
Jan 23 10:32:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:49.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:49 compute-1 ceph-mon[80126]: pgmap v1191: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Jan 23 10:32:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:49.783 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:50 compute-1 nova_compute[225705]: 2026-01-23 10:32:50.283 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:32:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:51.371 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:51 compute-1 ceph-mon[80126]: pgmap v1192: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 78 KiB/s rd, 0 B/s wr, 129 op/s
Jan 23 10:32:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:51.787 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:52 compute-1 nova_compute[225705]: 2026-01-23 10:32:52.208 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:53.375 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:53 compute-1 podman[247065]: 2026-01-23 10:32:53.730297805 +0000 UTC m=+0.132016661 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Jan 23 10:32:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:53.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:54 compute-1 ceph-mon[80126]: pgmap v1193: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 93 KiB/s rd, 0 B/s wr, 155 op/s
Jan 23 10:32:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:32:55.064 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:32:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:32:55.065 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:32:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:32:55.065 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:32:55 compute-1 nova_compute[225705]: 2026-01-23 10:32:55.286 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:55.378 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:55.790 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:56 compute-1 ceph-mon[80126]: pgmap v1194: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 63 KiB/s rd, 0 B/s wr, 105 op/s
Jan 23 10:32:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:32:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:32:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:32:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:32:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:32:57 compute-1 nova_compute[225705]: 2026-01-23 10:32:57.209 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:32:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:32:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:57.381 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:32:57 compute-1 ceph-mon[80126]: pgmap v1195: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 64 KiB/s rd, 0 B/s wr, 105 op/s
Jan 23 10:32:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:57.793 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:32:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:32:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:32:59.384 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:32:59 compute-1 sudo[247095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:32:59 compute-1 sudo[247095]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:32:59 compute-1 sudo[247095]: pam_unix(sudo:session): session closed for user root
Jan 23 10:32:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:32:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:32:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:32:59.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:32:59 compute-1 sudo[247120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:32:59 compute-1 sudo[247120]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:00 compute-1 ceph-mon[80126]: pgmap v1196: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 0 B/s wr, 26 op/s
Jan 23 10:33:00 compute-1 nova_compute[225705]: 2026-01-23 10:33:00.290 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:00 compute-1 sudo[247120]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:33:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:33:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:33:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:33:01 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:33:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:01.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:01.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:02 compute-1 ceph-mon[80126]: pgmap v1197: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 16 KiB/s rd, 0 B/s wr, 26 op/s
Jan 23 10:33:02 compute-1 nova_compute[225705]: 2026-01-23 10:33:02.211 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:03.393 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:03.799 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:04 compute-1 ceph-mon[80126]: pgmap v1198: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 17 KiB/s rd, 0 B/s wr, 27 op/s
Jan 23 10:33:05 compute-1 sudo[247178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:33:05 compute-1 sudo[247178]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:05 compute-1 sudo[247178]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:05 compute-1 nova_compute[225705]: 2026-01-23 10:33:05.293 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:05.396 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:05.802 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:05 compute-1 ceph-mon[80126]: pgmap v1199: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 522 B/s rd, 0 op/s
Jan 23 10:33:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:33:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:07 compute-1 nova_compute[225705]: 2026-01-23 10:33:07.213 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:07.400 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #70. Immutable memtables: 0.
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.464205) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 41] Flushing memtable with next log file: 70
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387464256, "job": 41, "event": "flush_started", "num_memtables": 1, "num_entries": 743, "num_deletes": 251, "total_data_size": 1499624, "memory_usage": 1527584, "flush_reason": "Manual Compaction"}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 41] Level-0 flush table #71: started
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387474950, "cf_name": "default", "job": 41, "event": "table_file_creation", "file_number": 71, "file_size": 980518, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 36769, "largest_seqno": 37507, "table_properties": {"data_size": 976921, "index_size": 1441, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 7221, "raw_average_key_size": 17, "raw_value_size": 969770, "raw_average_value_size": 2298, "num_data_blocks": 62, "num_entries": 422, "num_filter_entries": 422, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164338, "oldest_key_time": 1769164338, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 71, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 41] Flush lasted 10797 microseconds, and 6348 cpu microseconds.
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.475002) [db/flush_job.cc:967] [default] [JOB 41] Level-0 flush table #71: 980518 bytes OK
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.475022) [db/memtable_list.cc:519] [default] Level-0 commit table #71 started
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.477030) [db/memtable_list.cc:722] [default] Level-0 commit table #71: memtable #1 done
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.477083) EVENT_LOG_v1 {"time_micros": 1769164387477069, "job": 41, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.477117) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 41] Try to delete WAL files size 1495708, prev total WAL file size 1495708, number of live WAL files 2.
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000067.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.478365) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B7600323531' seq:72057594037927935, type:22 .. '6B7600353033' seq:0, type:0; will stop at (end)
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 42] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 41 Base level 0, inputs: [71(957KB)], [69(11MB)]
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387478426, "job": 42, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [71], "files_L6": [69], "score": -1, "input_data_size": 13534711, "oldest_snapshot_seqno": -1}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 42] Generated table #72: 6562 keys, 12131851 bytes, temperature: kUnknown
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387562779, "cf_name": "default", "job": 42, "event": "table_file_creation", "file_number": 72, "file_size": 12131851, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 12090948, "index_size": 23383, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 16453, "raw_key_size": 173211, "raw_average_key_size": 26, "raw_value_size": 11975454, "raw_average_value_size": 1824, "num_data_blocks": 913, "num_entries": 6562, "num_filter_entries": 6562, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164387, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 72, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.563317) [db/compaction/compaction_job.cc:1663] [default] [JOB 42] Compacted 1@0 + 1@6 files to L6 => 12131851 bytes
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564748) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.3 rd, 143.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.9, 12.0 +0.0 blob) out(11.6 +0.0 blob), read-write-amplify(26.2) write-amplify(12.4) OK, records in: 7078, records dropped: 516 output_compression: NoCompression
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.564772) EVENT_LOG_v1 {"time_micros": 1769164387564761, "job": 42, "event": "compaction_finished", "compaction_time_micros": 84428, "compaction_time_cpu_micros": 34573, "output_level": 6, "num_output_files": 1, "total_output_size": 12131851, "num_input_records": 7078, "num_output_records": 6562, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000071.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387565306, "job": 42, "event": "table_file_deletion", "file_number": 71}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000069.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164387568605, "job": 42, "event": "table_file_deletion", "file_number": 69}
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.478235) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568685) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568691) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568693) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568695) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:33:07.568697) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:33:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:07.804 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:08 compute-1 ceph-mon[80126]: pgmap v1200: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 783 B/s rd, 0 op/s
Jan 23 10:33:08 compute-1 sudo[247205]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:33:08 compute-1 sudo[247205]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:08 compute-1 sudo[247205]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:09.403 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:09 compute-1 podman[247231]: 2026-01-23 10:33:09.682078877 +0000 UTC m=+0.082104693 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:33:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:09.806 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:10 compute-1 ceph-mon[80126]: pgmap v1201: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 522 B/s rd, 0 op/s
Jan 23 10:33:10 compute-1 nova_compute[225705]: 2026-01-23 10:33:10.297 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:11.407 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:11.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:12 compute-1 nova_compute[225705]: 2026-01-23 10:33:12.214 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:12 compute-1 ceph-mon[80126]: pgmap v1202: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 522 B/s rd, 0 op/s
Jan 23 10:33:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:13.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:13.810 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:13 compute-1 nova_compute[225705]: 2026-01-23 10:33:13.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:13 compute-1 nova_compute[225705]: 2026-01-23 10:33:13.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:33:13 compute-1 nova_compute[225705]: 2026-01-23 10:33:13.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:33:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:13 compute-1 nova_compute[225705]: 2026-01-23 10:33:13.890 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:33:14 compute-1 ceph-mon[80126]: pgmap v1203: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 764 B/s rd, 0 op/s
Jan 23 10:33:14 compute-1 nova_compute[225705]: 2026-01-23 10:33:14.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:15 compute-1 nova_compute[225705]: 2026-01-23 10:33:15.324 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:15 compute-1 ceph-mon[80126]: pgmap v1204: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 509 B/s rd, 0 op/s
Jan 23 10:33:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:33:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:15.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:33:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:15.813 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:15 compute-1 nova_compute[225705]: 2026-01-23 10:33:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:17 compute-1 nova_compute[225705]: 2026-01-23 10:33:17.215 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:17.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:17 compute-1 ceph-mon[80126]: pgmap v1205: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 764 B/s rd, 0 op/s
Jan 23 10:33:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:17.815 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:17 compute-1 nova_compute[225705]: 2026-01-23 10:33:17.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:17 compute-1 nova_compute[225705]: 2026-01-23 10:33:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:17 compute-1 nova_compute[225705]: 2026-01-23 10:33:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.030 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.030 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.031 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.031 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.031 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:33:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:33:18 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/944075048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.508 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:33:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1310870223' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/944075048' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4059943484' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.743 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.746 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4856MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.746 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.747 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.846 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.846 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:33:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:18 compute-1 nova_compute[225705]: 2026-01-23 10:33:18.897 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:33:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:33:19 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2868338412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:19 compute-1 nova_compute[225705]: 2026-01-23 10:33:19.377 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.481s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:33:19 compute-1 nova_compute[225705]: 2026-01-23 10:33:19.384 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:33:19 compute-1 nova_compute[225705]: 2026-01-23 10:33:19.409 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:33:19 compute-1 nova_compute[225705]: 2026-01-23 10:33:19.411 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:33:19 compute-1 nova_compute[225705]: 2026-01-23 10:33:19.412 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.665s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:33:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:19.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:19.817 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:20 compute-1 ceph-mon[80126]: pgmap v1206: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 509 B/s rd, 0 op/s
Jan 23 10:33:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1727738214' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2868338412' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:20 compute-1 nova_compute[225705]: 2026-01-23 10:33:20.358 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:20 compute-1 nova_compute[225705]: 2026-01-23 10:33:20.413 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:20 compute-1 nova_compute[225705]: 2026-01-23 10:33:20.414 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:20 compute-1 nova_compute[225705]: 2026-01-23 10:33:20.414 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:33:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3715483574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:33:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:21.422 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:21.819 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:21 compute-1 ceph-mon[80126]: pgmap v1207: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 509 B/s rd, 0 op/s
Jan 23 10:33:21 compute-1 nova_compute[225705]: 2026-01-23 10:33:21.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:22 compute-1 nova_compute[225705]: 2026-01-23 10:33:22.218 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:33:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:23.425 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:33:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:23.821 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:23 compute-1 ceph-mon[80126]: pgmap v1208: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 764 B/s rd, 0 op/s
Jan 23 10:33:23 compute-1 nova_compute[225705]: 2026-01-23 10:33:23.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:33:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:24 compute-1 podman[247302]: 2026-01-23 10:33:24.682088174 +0000 UTC m=+0.085234011 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Jan 23 10:33:25 compute-1 nova_compute[225705]: 2026-01-23 10:33:25.360 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:25.429 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:25.823 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:25 compute-1 ceph-mon[80126]: pgmap v1209: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:27 compute-1 nova_compute[225705]: 2026-01-23 10:33:27.220 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:27.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:27.825 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:27 compute-1 ceph-mon[80126]: pgmap v1210: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:28 compute-1 sudo[247330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:33:28 compute-1 sudo[247330]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:28 compute-1 sudo[247330]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:29.436 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:29.827 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:30 compute-1 ceph-mon[80126]: pgmap v1211: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:30 compute-1 nova_compute[225705]: 2026-01-23 10:33:30.364 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:31.440 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:31.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:32 compute-1 nova_compute[225705]: 2026-01-23 10:33:32.223 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:32 compute-1 ceph-mon[80126]: pgmap v1212: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:33.444 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:33.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:34 compute-1 ceph-mon[80126]: pgmap v1213: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:35 compute-1 nova_compute[225705]: 2026-01-23 10:33:35.406 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:35.447 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:35 compute-1 ceph-mon[80126]: pgmap v1214: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:35.834 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:37 compute-1 nova_compute[225705]: 2026-01-23 10:33:37.224 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:37.451 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:37 compute-1 ceph-mon[80126]: pgmap v1215: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:37.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:39.454 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:39 compute-1 ceph-mon[80126]: pgmap v1216: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:39.838 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:40 compute-1 nova_compute[225705]: 2026-01-23 10:33:40.411 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:40 compute-1 podman[247361]: 2026-01-23 10:33:40.644860829 +0000 UTC m=+0.052944175 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:33:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:41.458 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:41.840 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:41 compute-1 ceph-mon[80126]: pgmap v1217: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:42 compute-1 nova_compute[225705]: 2026-01-23 10:33:42.257 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:43.462 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:43.842 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:44 compute-1 ceph-mon[80126]: pgmap v1218: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:45 compute-1 nova_compute[225705]: 2026-01-23 10:33:45.414 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:45.466 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:45.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:46 compute-1 ceph-mon[80126]: pgmap v1219: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:47 compute-1 nova_compute[225705]: 2026-01-23 10:33:47.259 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:47.469 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:47.846 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:47 compute-1 sshd-session[247384]: Invalid user sol from 45.148.10.240 port 32852
Jan 23 10:33:48 compute-1 ceph-mon[80126]: pgmap v1220: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:48 compute-1 sshd-session[247384]: Connection closed by invalid user sol 45.148.10.240 port 32852 [preauth]
Jan 23 10:33:48 compute-1 sudo[247386]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:33:48 compute-1 sudo[247386]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:33:48 compute-1 sudo[247386]: pam_unix(sudo:session): session closed for user root
Jan 23 10:33:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2430308627' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:33:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2430308627' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:33:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:49.471 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:49.848 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:50 compute-1 ceph-mon[80126]: pgmap v1221: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:33:50 compute-1 nova_compute[225705]: 2026-01-23 10:33:50.415 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:51.475 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:51.850 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:52 compute-1 nova_compute[225705]: 2026-01-23 10:33:52.263 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:52 compute-1 ceph-mon[80126]: pgmap v1222: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:53.478 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:53 compute-1 ceph-mon[80126]: pgmap v1223: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:53.853 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:33:55.066 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:33:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:33:55.066 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:33:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:33:55.066 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:33:55 compute-1 nova_compute[225705]: 2026-01-23 10:33:55.418 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:55.481 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:55 compute-1 podman[247415]: 2026-01-23 10:33:55.709455459 +0000 UTC m=+0.115772551 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251202)
Jan 23 10:33:55 compute-1 ceph-mon[80126]: pgmap v1224: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:33:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:55.855 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:33:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:33:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:33:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:33:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:33:57 compute-1 nova_compute[225705]: 2026-01-23 10:33:57.304 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:33:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:57.486 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:33:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:57.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:33:57 compute-1 ceph-mon[80126]: pgmap v1225: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:33:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:33:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:33:59.490 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:33:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:33:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:33:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:33:59.859 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:00 compute-1 ceph-mon[80126]: pgmap v1226: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:00 compute-1 nova_compute[225705]: 2026-01-23 10:34:00.421 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:01.493 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:01 compute-1 ceph-mon[80126]: pgmap v1227: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:01.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:02 compute-1 nova_compute[225705]: 2026-01-23 10:34:02.306 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:03.498 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:03.865 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:03 compute-1 ceph-mon[80126]: pgmap v1228: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:05 compute-1 sudo[247445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:34:05 compute-1 sudo[247445]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:05 compute-1 sudo[247445]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:05 compute-1 nova_compute[225705]: 2026-01-23 10:34:05.424 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:05 compute-1 sudo[247470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:34:05 compute-1 sudo[247470]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:05.502 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:05.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:05 compute-1 ceph-mon[80126]: pgmap v1229: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:06 compute-1 sudo[247470]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:34:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:34:06 compute-1 ceph-mon[80126]: pgmap v1230: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 794 B/s rd, 0 op/s
Jan 23 10:34:06 compute-1 ceph-mon[80126]: pgmap v1231: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:34:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:34:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:34:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:07 compute-1 nova_compute[225705]: 2026-01-23 10:34:07.313 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:07.507 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:07.869 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:08 compute-1 sudo[247529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:34:08 compute-1 sudo[247529]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:08 compute-1 sudo[247529]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:09 compute-1 ceph-mon[80126]: pgmap v1232: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:09.510 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:09.871 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:10 compute-1 ceph-mon[80126]: pgmap v1233: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:10 compute-1 nova_compute[225705]: 2026-01-23 10:34:10.429 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:11 compute-1 sudo[247555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:34:11 compute-1 sudo[247555]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:11 compute-1 sudo[247555]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:11 compute-1 podman[247579]: 2026-01-23 10:34:11.497007685 +0000 UTC m=+0.063332652 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:34:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:11.514 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:11.873 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:34:12 compute-1 nova_compute[225705]: 2026-01-23 10:34:12.319 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:13 compute-1 ceph-mon[80126]: pgmap v1234: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:13.518 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:13 compute-1 nova_compute[225705]: 2026-01-23 10:34:13.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:13 compute-1 nova_compute[225705]: 2026-01-23 10:34:13.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:34:13 compute-1 nova_compute[225705]: 2026-01-23 10:34:13.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:34:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:13.875 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:13 compute-1 nova_compute[225705]: 2026-01-23 10:34:13.902 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:34:15 compute-1 ceph-mon[80126]: pgmap v1235: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 639 B/s rd, 0 op/s
Jan 23 10:34:15 compute-1 nova_compute[225705]: 2026-01-23 10:34:15.433 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:15.521 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:15.877 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:16 compute-1 ceph-mon[80126]: pgmap v1236: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 614 B/s rd, 0 op/s
Jan 23 10:34:16 compute-1 nova_compute[225705]: 2026-01-23 10:34:16.896 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:17 compute-1 nova_compute[225705]: 2026-01-23 10:34:17.324 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:17.526 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:17 compute-1 nova_compute[225705]: 2026-01-23 10:34:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:17 compute-1 nova_compute[225705]: 2026-01-23 10:34:17.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:17.879 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:18 compute-1 nova_compute[225705]: 2026-01-23 10:34:18.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:18 compute-1 nova_compute[225705]: 2026-01-23 10:34:18.928 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:34:18 compute-1 nova_compute[225705]: 2026-01-23 10:34:18.929 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:34:18 compute-1 nova_compute[225705]: 2026-01-23 10:34:18.929 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:34:18 compute-1 nova_compute[225705]: 2026-01-23 10:34:18.929 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:34:18 compute-1 nova_compute[225705]: 2026-01-23 10:34:18.930 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:34:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:34:19 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1917450218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:19 compute-1 nova_compute[225705]: 2026-01-23 10:34:19.392 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:34:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:19.529 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:19 compute-1 ceph-mon[80126]: pgmap v1237: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:19 compute-1 nova_compute[225705]: 2026-01-23 10:34:19.573 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:34:19 compute-1 nova_compute[225705]: 2026-01-23 10:34:19.574 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4861MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:34:19 compute-1 nova_compute[225705]: 2026-01-23 10:34:19.574 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:34:19 compute-1 nova_compute[225705]: 2026-01-23 10:34:19.575 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:34:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:19.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.106 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.107 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.132 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.158 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.159 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.176 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.211 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.228 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.481 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:34:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1049699909' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.735 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.743 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:34:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1917450218' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-1 ceph-mon[80126]: pgmap v1238: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/740635230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/505590307' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.890 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.892 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:34:20 compute-1 nova_compute[225705]: 2026-01-23 10:34:20.893 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.318s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:34:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:21.532 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:21.884 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:21 compute-1 nova_compute[225705]: 2026-01-23 10:34:21.893 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:21 compute-1 nova_compute[225705]: 2026-01-23 10:34:21.894 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:21 compute-1 nova_compute[225705]: 2026-01-23 10:34:21.894 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:21 compute-1 nova_compute[225705]: 2026-01-23 10:34:21.894 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:34:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1049699909' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1536309353' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1417754147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:34:22 compute-1 nova_compute[225705]: 2026-01-23 10:34:22.327 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:23 compute-1 ceph-mon[80126]: pgmap v1239: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:23.535 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:23 compute-1 nova_compute[225705]: 2026-01-23 10:34:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:34:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:23.885 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:25 compute-1 ceph-mon[80126]: pgmap v1240: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:25 compute-1 nova_compute[225705]: 2026-01-23 10:34:25.484 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:25.539 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:25.887 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:26 compute-1 podman[247651]: 2026-01-23 10:34:26.726587849 +0000 UTC m=+0.109790202 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Jan 23 10:34:26 compute-1 ceph-mon[80126]: pgmap v1241: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:27 compute-1 nova_compute[225705]: 2026-01-23 10:34:27.328 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:27.541 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:27.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:28 compute-1 sudo[247678]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:34:28 compute-1 sudo[247678]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:28 compute-1 sudo[247678]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:29 compute-1 ceph-mon[80126]: pgmap v1242: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:29.545 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:29.892 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:30 compute-1 nova_compute[225705]: 2026-01-23 10:34:30.487 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:30 compute-1 ceph-mon[80126]: pgmap v1243: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:31.548 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:31.894 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:32 compute-1 nova_compute[225705]: 2026-01-23 10:34:32.331 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:32 compute-1 ceph-mon[80126]: pgmap v1244: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:33.552 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:33.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:35 compute-1 ceph-mon[80126]: pgmap v1245: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:35 compute-1 nova_compute[225705]: 2026-01-23 10:34:35.491 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:35.557 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:35.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:37 compute-1 ceph-mon[80126]: pgmap v1246: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:37 compute-1 nova_compute[225705]: 2026-01-23 10:34:37.374 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:37.561 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:37.902 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:38 compute-1 rsyslogd[1006]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Jan 23 10:34:39 compute-1 ceph-mon[80126]: pgmap v1247: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:39.565 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:39.904 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:40 compute-1 nova_compute[225705]: 2026-01-23 10:34:40.495 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:40 compute-1 ceph-mon[80126]: pgmap v1248: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:41.569 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:41 compute-1 podman[247711]: 2026-01-23 10:34:41.660940103 +0000 UTC m=+0.062141585 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:34:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:41.905 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:42 compute-1 nova_compute[225705]: 2026-01-23 10:34:42.376 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:43 compute-1 ceph-mon[80126]: pgmap v1249: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:43.573 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:43.908 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:45 compute-1 ceph-mon[80126]: pgmap v1250: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:45 compute-1 nova_compute[225705]: 2026-01-23 10:34:45.497 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:45.576 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:45.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:47 compute-1 ceph-mon[80126]: pgmap v1251: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:47 compute-1 nova_compute[225705]: 2026-01-23 10:34:47.377 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:47.579 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:47.912 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:48 compute-1 sudo[247734]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:34:48 compute-1 sudo[247734]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:34:48 compute-1 sudo[247734]: pam_unix(sudo:session): session closed for user root
Jan 23 10:34:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:49 compute-1 ceph-mon[80126]: pgmap v1252: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/4134536473' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:34:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/4134536473' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:34:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:49.582 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:49.915 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:34:50 compute-1 nova_compute[225705]: 2026-01-23 10:34:50.501 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:51 compute-1 ceph-mon[80126]: pgmap v1253: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:51.585 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:51.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:52 compute-1 nova_compute[225705]: 2026-01-23 10:34:52.378 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:53 compute-1 ceph-mon[80126]: pgmap v1254: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:53.589 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:53.919 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:34:55.067 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:34:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:34:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:34:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:34:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:34:55 compute-1 ceph-mon[80126]: pgmap v1255: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:55 compute-1 nova_compute[225705]: 2026-01-23 10:34:55.505 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:55.593 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:55.921 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:34:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:34:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:34:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:34:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:34:57 compute-1 nova_compute[225705]: 2026-01-23 10:34:57.380 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:34:57 compute-1 ceph-mon[80126]: pgmap v1256: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:34:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:34:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:57.596 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:34:57 compute-1 podman[247764]: 2026-01-23 10:34:57.714607327 +0000 UTC m=+0.116012069 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 10:34:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:34:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:57.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:34:58 compute-1 ceph-mon[80126]: pgmap v1257: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:34:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:34:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:34:59.600 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:34:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:34:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:34:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:34:59.926 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:00 compute-1 nova_compute[225705]: 2026-01-23 10:35:00.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:01 compute-1 ceph-mon[80126]: pgmap v1258: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:01.603 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:01.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:02 compute-1 nova_compute[225705]: 2026-01-23 10:35:02.381 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:02 compute-1 ceph-mon[80126]: pgmap v1259: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:03.607 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:03.934 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:04 compute-1 ceph-mon[80126]: pgmap v1260: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:05 compute-1 nova_compute[225705]: 2026-01-23 10:35:05.510 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:05.610 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:05.936 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:06 compute-1 ceph-mon[80126]: pgmap v1261: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:07 compute-1 nova_compute[225705]: 2026-01-23 10:35:07.383 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:07.615 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:07.940 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:08 compute-1 sudo[247796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:35:08 compute-1 sudo[247796]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:08 compute-1 sudo[247796]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:09 compute-1 ceph-mon[80126]: pgmap v1262: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:09.618 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:09.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:10 compute-1 ceph-mon[80126]: pgmap v1263: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:10 compute-1 nova_compute[225705]: 2026-01-23 10:35:10.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:11 compute-1 sudo[247823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:35:11 compute-1 sudo[247823]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:11 compute-1 sudo[247823]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:11.621 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:11 compute-1 sudo[247848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:35:11 compute-1 sudo[247848]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:11 compute-1 podman[247872]: 2026-01-23 10:35:11.763389873 +0000 UTC m=+0.054547145 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:35:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:11.946 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:12 compute-1 sudo[247848]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:12 compute-1 nova_compute[225705]: 2026-01-23 10:35:12.385 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-0", "name": "osd_memory_target"}]: dispatch
Jan 23 10:35:12 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-1", "name": "osd_memory_target"}]: dispatch
Jan 23 10:35:13 compute-1 ceph-mon[80126]: pgmap v1264: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:13.624 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:13 compute-1 nova_compute[225705]: 2026-01-23 10:35:13.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:13 compute-1 nova_compute[225705]: 2026-01-23 10:35:13.877 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:35:13 compute-1 nova_compute[225705]: 2026-01-23 10:35:13.877 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:35:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:13.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:14 compute-1 nova_compute[225705]: 2026-01-23 10:35:14.536 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:35:15 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:15 compute-1 ceph-mon[80126]: pgmap v1265: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:15 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:15 compute-1 nova_compute[225705]: 2026-01-23 10:35:15.532 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:15.628 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:15.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config rm", "who": "osd/host:compute-2", "name": "osd_memory_target"}]: dispatch
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:35:17 compute-1 ceph-mon[80126]: pgmap v1266: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:35:17 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:35:17 compute-1 nova_compute[225705]: 2026-01-23 10:35:17.388 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:17 compute-1 nova_compute[225705]: 2026-01-23 10:35:17.529 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:17.632 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:17.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:18 compute-1 ceph-mon[80126]: pgmap v1267: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:18 compute-1 nova_compute[225705]: 2026-01-23 10:35:18.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:19.635 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.924 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.925 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.925 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.925 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:35:19 compute-1 nova_compute[225705]: 2026-01-23 10:35:19.926 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:35:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:19.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:35:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/905573868' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.414 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.535 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.620 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.622 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4854MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.623 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.624 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:35:20 compute-1 ceph-mon[80126]: pgmap v1268: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:20 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/905573868' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.853 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.854 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:35:20 compute-1 nova_compute[225705]: 2026-01-23 10:35:20.875 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:35:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:35:21 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/573725531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:21 compute-1 nova_compute[225705]: 2026-01-23 10:35:21.374 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.499s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:35:21 compute-1 nova_compute[225705]: 2026-01-23 10:35:21.384 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:35:21 compute-1 nova_compute[225705]: 2026-01-23 10:35:21.464 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:35:21 compute-1 nova_compute[225705]: 2026-01-23 10:35:21.467 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:35:21 compute-1 nova_compute[225705]: 2026-01-23 10:35:21.468 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.844s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:35:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:21.638 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:21 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/573725531' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:21.961 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:22 compute-1 nova_compute[225705]: 2026-01-23 10:35:22.391 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:23 compute-1 ceph-mon[80126]: pgmap v1269: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 819 B/s rd, 0 op/s
Jan 23 10:35:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3397794839' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1836247010' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:23 compute-1 sudo[247973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:35:23 compute-1 sudo[247973]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:23 compute-1 sudo[247973]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:23 compute-1 nova_compute[225705]: 2026-01-23 10:35:23.468 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:23 compute-1 nova_compute[225705]: 2026-01-23 10:35:23.469 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:23 compute-1 nova_compute[225705]: 2026-01-23 10:35:23.469 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:35:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:23.642 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:23 compute-1 nova_compute[225705]: 2026-01-23 10:35:23.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:23.963 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/342726147' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:24 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:35:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3852357449' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:35:25 compute-1 ceph-mon[80126]: pgmap v1270: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:25 compute-1 nova_compute[225705]: 2026-01-23 10:35:25.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:25.645 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:25 compute-1 nova_compute[225705]: 2026-01-23 10:35:25.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:35:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:25.966 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:26 compute-1 ceph-mon[80126]: pgmap v1271: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 546 B/s rd, 0 op/s
Jan 23 10:35:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:27 compute-1 nova_compute[225705]: 2026-01-23 10:35:27.393 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:27.649 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:27.968 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:28 compute-1 podman[248001]: 2026-01-23 10:35:28.732061928 +0000 UTC m=+0.123939487 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Jan 23 10:35:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:29 compute-1 sudo[248027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:35:29 compute-1 sudo[248027]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:29 compute-1 sudo[248027]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:29 compute-1 ceph-mon[80126]: pgmap v1272: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:29 compute-1 nova_compute[225705]: 2026-01-23 10:35:29.580 225709 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:35:29 compute-1 nova_compute[225705]: 2026-01-23 10:35:29.614 225709 DEBUG oslo_concurrency.processutils [None req-a8510dbf-d677-4163-b681-0279df98cd8c 00aca23f964f49a5a9abfea9744e871b 5220cd4f58cb43bb899e367e961bc5c1 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:35:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:29.652 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:29.971 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:30 compute-1 nova_compute[225705]: 2026-01-23 10:35:30.543 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:31 compute-1 ceph-mon[80126]: pgmap v1273: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:31.656 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:31.974 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:32 compute-1 nova_compute[225705]: 2026-01-23 10:35:32.415 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:32 compute-1 ceph-mon[80126]: pgmap v1274: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:35:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:33.659 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:35:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:33.977 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:35 compute-1 ceph-mon[80126]: pgmap v1275: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:35 compute-1 nova_compute[225705]: 2026-01-23 10:35:35.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:35:35.629 143098 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': '76:02:20', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '3a:46:f9:a0:85:06'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Jan 23 10:35:35 compute-1 nova_compute[225705]: 2026-01-23 10:35:35.630 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:35 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:35:35.631 143098 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Jan 23 10:35:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:35.662 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:35:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:35.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:35:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:37 compute-1 ceph-mon[80126]: pgmap v1276: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:37 compute-1 nova_compute[225705]: 2026-01-23 10:35:37.418 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:37.666 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:37.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:38 compute-1 ceph-mon[80126]: pgmap v1277: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:39.670 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:39.987 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:40 compute-1 ceph-mon[80126]: pgmap v1278: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:40 compute-1 nova_compute[225705]: 2026-01-23 10:35:40.547 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:40 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:35:40.634 143098 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=170ec811-bf2b-4b3a-9339-50a49c79a1e6, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Jan 23 10:35:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:35:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:41.674 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:35:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:41.990 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:42 compute-1 nova_compute[225705]: 2026-01-23 10:35:42.420 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:42 compute-1 podman[248060]: 2026-01-23 10:35:42.653338549 +0000 UTC m=+0.059426050 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Jan 23 10:35:42 compute-1 ceph-mon[80126]: pgmap v1279: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:43.678 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:43.993 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:44 compute-1 ceph-mon[80126]: pgmap v1280: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:45 compute-1 nova_compute[225705]: 2026-01-23 10:35:45.550 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:45.681 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:45.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:47 compute-1 ceph-mon[80126]: pgmap v1281: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:47 compute-1 nova_compute[225705]: 2026-01-23 10:35:47.423 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:47.685 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:47.999 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:49 compute-1 sudo[248082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:35:49 compute-1 sudo[248082]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:35:49 compute-1 sudo[248082]: pam_unix(sudo:session): session closed for user root
Jan 23 10:35:49 compute-1 ceph-mon[80126]: pgmap v1282: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1762354554' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:35:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1762354554' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:35:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:49.688 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:50.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:50 compute-1 ceph-mon[80126]: pgmap v1283: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:35:50 compute-1 nova_compute[225705]: 2026-01-23 10:35:50.554 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:51.691 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:35:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:52.011 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:35:52 compute-1 nova_compute[225705]: 2026-01-23 10:35:52.463 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:53 compute-1 ceph-mon[80126]: pgmap v1284: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:53.693 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:54.014 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:54 compute-1 ceph-mon[80126]: pgmap v1285: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:54 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #73. Immutable memtables: 0.
Jan 23 10:35:54 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:54.992564) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:35:54 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 43] Flushing memtable with next log file: 73
Jan 23 10:35:54 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164554992603, "job": 43, "event": "flush_started", "num_memtables": 1, "num_entries": 1869, "num_deletes": 251, "total_data_size": 4935419, "memory_usage": 5020464, "flush_reason": "Manual Compaction"}
Jan 23 10:35:54 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 43] Level-0 flush table #74: started
Jan 23 10:35:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:35:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:35:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:35:55.068 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:35:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:35:55.069 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555362000, "cf_name": "default", "job": 43, "event": "table_file_creation", "file_number": 74, "file_size": 3204080, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 37512, "largest_seqno": 39376, "table_properties": {"data_size": 3196224, "index_size": 4735, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16131, "raw_average_key_size": 20, "raw_value_size": 3180644, "raw_average_value_size": 3995, "num_data_blocks": 201, "num_entries": 796, "num_filter_entries": 796, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164388, "oldest_key_time": 1769164388, "file_creation_time": 1769164554, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 74, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 43] Flush lasted 369483 microseconds, and 6350 cpu microseconds.
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:35:55 compute-1 nova_compute[225705]: 2026-01-23 10:35:55.557 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.362046) [db/flush_job.cc:967] [default] [JOB 43] Level-0 flush table #74: 3204080 bytes OK
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.362068) [db/memtable_list.cc:519] [default] Level-0 commit table #74 started
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.681309) [db/memtable_list.cc:722] [default] Level-0 commit table #74: memtable #1 done
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.681374) EVENT_LOG_v1 {"time_micros": 1769164555681361, "job": 43, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.681406) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 43] Try to delete WAL files size 4927089, prev total WAL file size 4927370, number of live WAL files 2.
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000070.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.683586) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033303132' seq:72057594037927935, type:22 .. '7061786F730033323634' seq:0, type:0; will stop at (end)
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 44] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 43 Base level 0, inputs: [74(3128KB)], [72(11MB)]
Jan 23 10:35:55 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164555683677, "job": 44, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [74], "files_L6": [72], "score": -1, "input_data_size": 15335931, "oldest_snapshot_seqno": -1}
Jan 23 10:35:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:55.696 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:56.018 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 44] Generated table #75: 6840 keys, 13115150 bytes, temperature: kUnknown
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164556443953, "cf_name": "default", "job": 44, "event": "table_file_creation", "file_number": 75, "file_size": 13115150, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 13071551, "index_size": 25375, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17157, "raw_key_size": 179722, "raw_average_key_size": 26, "raw_value_size": 12950149, "raw_average_value_size": 1893, "num_data_blocks": 991, "num_entries": 6840, "num_filter_entries": 6840, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164555, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 75, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.444406) [db/compaction/compaction_job.cc:1663] [default] [JOB 44] Compacted 1@0 + 1@6 files to L6 => 13115150 bytes
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.550432) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 20.2 rd, 17.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.1, 11.6 +0.0 blob) out(12.5 +0.0 blob), read-write-amplify(8.9) write-amplify(4.1) OK, records in: 7358, records dropped: 518 output_compression: NoCompression
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.550528) EVENT_LOG_v1 {"time_micros": 1769164556550464, "job": 44, "event": "compaction_finished", "compaction_time_micros": 760396, "compaction_time_cpu_micros": 55113, "output_level": 6, "num_output_files": 1, "total_output_size": 13115150, "num_input_records": 7358, "num_output_records": 6840, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000074.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164556552228, "job": 44, "event": "table_file_deletion", "file_number": 74}
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000072.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164556556941, "job": 44, "event": "table_file_deletion", "file_number": 72}
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:55.683417) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557142) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:56 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:35:56.557164) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:35:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:35:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:35:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:35:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:35:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:35:57 compute-1 ceph-mon[80126]: pgmap v1286: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:35:57 compute-1 nova_compute[225705]: 2026-01-23 10:35:57.467 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:35:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:57.700 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:35:58.020 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:58 compute-1 ceph-mon[80126]: pgmap v1287: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:35:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:35:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:35:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:35:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:35:59.704 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:35:59 compute-1 podman[248113]: 2026-01-23 10:35:59.753024444 +0000 UTC m=+0.151870787 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Jan 23 10:36:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:00.023 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:00 compute-1 nova_compute[225705]: 2026-01-23 10:36:00.560 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:01 compute-1 ceph-mon[80126]: pgmap v1288: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:36:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:01.707 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:36:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:02.027 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:02 compute-1 nova_compute[225705]: 2026-01-23 10:36:02.469 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:02 compute-1 ceph-mon[80126]: pgmap v1289: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:03 compute-1 sshd-session[248139]: Invalid user sol from 45.148.10.240 port 41676
Jan 23 10:36:03 compute-1 sshd-session[248139]: Connection closed by invalid user sol 45.148.10.240 port 41676 [preauth]
Jan 23 10:36:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:03.712 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:04.030 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:05 compute-1 ceph-mon[80126]: pgmap v1290: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:05 compute-1 nova_compute[225705]: 2026-01-23 10:36:05.563 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:05.716 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:06.033 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:06 compute-1 ceph-mon[80126]: pgmap v1291: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:07 compute-1 nova_compute[225705]: 2026-01-23 10:36:07.471 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:07.720 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:08.036 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:09 compute-1 ceph-mon[80126]: pgmap v1292: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:09 compute-1 sudo[248144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:36:09 compute-1 sudo[248144]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:09 compute-1 sudo[248144]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:09.725 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:10.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:10 compute-1 nova_compute[225705]: 2026-01-23 10:36:10.572 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:11 compute-1 ceph-mon[80126]: pgmap v1293: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:11.728 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:12.043 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:12 compute-1 nova_compute[225705]: 2026-01-23 10:36:12.473 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:12 compute-1 ceph-mon[80126]: pgmap v1294: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:13 compute-1 podman[248172]: 2026-01-23 10:36:13.669568874 +0000 UTC m=+0.074522555 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:36:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:13.731 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:14.046 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:14 compute-1 nova_compute[225705]: 2026-01-23 10:36:14.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:14 compute-1 nova_compute[225705]: 2026-01-23 10:36:14.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:36:14 compute-1 nova_compute[225705]: 2026-01-23 10:36:14.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:36:14 compute-1 ceph-mon[80126]: pgmap v1295: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:14 compute-1 nova_compute[225705]: 2026-01-23 10:36:14.978 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:36:14 compute-1 nova_compute[225705]: 2026-01-23 10:36:14.979 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:14 compute-1 nova_compute[225705]: 2026-01-23 10:36:14.979 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Jan 23 10:36:15 compute-1 nova_compute[225705]: 2026-01-23 10:36:15.578 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:15.734 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:16.048 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:17 compute-1 nova_compute[225705]: 2026-01-23 10:36:17.135 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:17 compute-1 ceph-mon[80126]: pgmap v1296: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:17 compute-1 nova_compute[225705]: 2026-01-23 10:36:17.473 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:17.737 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:18.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:18 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:19 compute-1 ceph-mon[80126]: pgmap v1297: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:19.740 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:19 compute-1 nova_compute[225705]: 2026-01-23 10:36:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:19 compute-1 nova_compute[225705]: 2026-01-23 10:36:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:36:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:20.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.209 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.210 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.210 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.211 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.211 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:36:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.582 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:36:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1725171694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.683 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.471s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.854 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.855 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4854MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.855 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:36:20 compute-1 nova_compute[225705]: 2026-01-23 10:36:20.856 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.109 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.110 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.155 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:36:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:36:21 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1801997280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.620 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.629 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:36:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:36:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:21.744 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:36:21 compute-1 ceph-mon[80126]: pgmap v1298: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:21 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1725171694' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.838 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.841 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.841 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.985s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.842 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:21 compute-1 nova_compute[225705]: 2026-01-23 10:36:21.842 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Jan 23 10:36:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:22.057 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:22 compute-1 nova_compute[225705]: 2026-01-23 10:36:22.476 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:22 compute-1 nova_compute[225705]: 2026-01-23 10:36:22.661 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Jan 23 10:36:22 compute-1 ceph-mon[80126]: pgmap v1299: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1801997280' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:23 compute-1 sudo[248240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:36:23 compute-1 sudo[248240]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:23 compute-1 sudo[248240]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:23 compute-1 sudo[248265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --image quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec --timeout 895 ls
Jan 23 10:36:23 compute-1 sudo[248265]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:23 compute-1 nova_compute[225705]: 2026-01-23 10:36:23.661 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:23 compute-1 nova_compute[225705]: 2026-01-23 10:36:23.662 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:23 compute-1 nova_compute[225705]: 2026-01-23 10:36:23.662 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:23.748 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:23 compute-1 nova_compute[225705]: 2026-01-23 10:36:23.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:23 compute-1 nova_compute[225705]: 2026-01-23 10:36:23.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:36:23 compute-1 nova_compute[225705]: 2026-01-23 10:36:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:24.058 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4289693492' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:24 compute-1 podman[248362]: 2026-01-23 10:36:24.281495924 +0000 UTC m=+0.222571778 container exec 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, CEPH_REF=squid, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:36:24 compute-1 podman[248362]: 2026-01-23 10:36:24.450666143 +0000 UTC m=+0.391741697 container exec_died 0d5b0e98337a3464d5d67b786921b07fb4daad264340b0ed4b4b9c650e10df0f (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-crash-compute-1, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.40.1, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph.git)
Jan 23 10:36:25 compute-1 podman[248481]: 2026-01-23 10:36:25.051930778 +0000 UTC m=+0.080873334 container exec 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:36:25 compute-1 podman[248505]: 2026-01-23 10:36:25.130713945 +0000 UTC m=+0.058783029 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:36:25 compute-1 podman[248481]: 2026-01-23 10:36:25.149881187 +0000 UTC m=+0.178823733 container exec_died 743c78689af2fa738ca9d8f1f03edbafea47d3f20823b9c860cb88e378ff4d78 (image=quay.io/prometheus/node-exporter:v1.7.0, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-node-exporter-compute-1, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Jan 23 10:36:25 compute-1 ceph-mon[80126]: pgmap v1300: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1368383346' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1080446153' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:25 compute-1 podman[248567]: 2026-01-23 10:36:25.544241075 +0000 UTC m=+0.071162248 container exec 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.40.1, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20250325, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:36:25 compute-1 podman[248567]: 2026-01-23 10:36:25.564074098 +0000 UTC m=+0.090995271 container exec_died 4802bf5f4f2f0aa30683bbd62a63096234d62b55a3aaded8f68f472c524572d6 (image=quay.io/ceph/ceph@sha256:7c69e59beaeea61ca714e71cb84ff6d5e533db7f1fd84143dd9ba6649a5fd2ec, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_SHA1=c92aebb279828e9c3c1f5d24613efca272649e62, io.buildah.version=1.40.1, CEPH_GIT_REPO=https://github.com/ceph/ceph.git, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team <ceph-maintainers@ceph.io>, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20250325, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/)
Jan 23 10:36:25 compute-1 nova_compute[225705]: 2026-01-23 10:36:25.584 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:25.752 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:25 compute-1 podman[248634]: 2026-01-23 10:36:25.960836783 +0000 UTC m=+0.208510166 container exec e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:36:25 compute-1 podman[248634]: 2026-01-23 10:36:25.976158045 +0000 UTC m=+0.223831388 container exec_died e3d04ed9319c2edb53493ce17b5a91ba93e956270f16bf21c433dfb4dbc8d75f (image=quay.io/ceph/haproxy:2.3, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-haproxy-nfs-cephfs-compute-1-mnxlgm)
Jan 23 10:36:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:26.060 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:26 compute-1 nova_compute[225705]: 2026-01-23 10:36:26.165 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:36:26 compute-1 podman[248700]: 2026-01-23 10:36:26.269671903 +0000 UTC m=+0.063005402 container exec 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, io.buildah.version=1.28.2, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Keepalived on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1793, name=keepalived, com.redhat.component=keepalived-container, summary=Provides keepalived on RHEL 9 for Ceph., version=2.2.4, description=keepalived for Ceph, io.openshift.tags=Ceph keepalived, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, build-date=2023-02-22T09:23:20)
Jan 23 10:36:26 compute-1 podman[248720]: 2026-01-23 10:36:26.465751069 +0000 UTC m=+0.169287165 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, description=keepalived for Ceph, release=1793, io.buildah.version=1.28.2, com.redhat.component=keepalived-container, vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vendor=Red Hat, Inc., version=2.2.4, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, architecture=x86_64, summary=Provides keepalived on RHEL 9 for Ceph., io.openshift.tags=Ceph keepalived, build-date=2023-02-22T09:23:20, io.openshift.expose-services=)
Jan 23 10:36:26 compute-1 ceph-mon[80126]: pgmap v1301: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:26 compute-1 podman[248700]: 2026-01-23 10:36:26.607311299 +0000 UTC m=+0.400644818 container exec_died 842a2a104e36e60aaf1951949414b91328abaef42ae5fb73c87a7a44cbfac9af (image=quay.io/ceph/keepalived:2.2.4, name=ceph-f3005f84-239a-55b6-a948-8f1fb592b920-keepalived-nfs-cephfs-compute-1-vcrquf, version=2.2.4, description=keepalived for Ceph, release=1793, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides keepalived on RHEL 9 for Ceph., vcs-ref=befaf1f5ec7b874aef2651ee1384d51828504eb9, vcs-type=git, build-date=2023-02-22T09:23:20, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.28.2, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Keepalived on RHEL 9, name=keepalived, io.openshift.tags=Ceph keepalived, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://access.redhat.com/containers/#/registry.access.redhat.com/ubi9-minimal/images/9.1.0-1793, com.redhat.component=keepalived-container, architecture=x86_64)
Jan 23 10:36:26 compute-1 sudo[248265]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:27 compute-1 sudo[248733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:36:27 compute-1 sudo[248733]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:27 compute-1 sudo[248733]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:27 compute-1 sudo[248758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:36:27 compute-1 sudo[248758]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:27 compute-1 nova_compute[225705]: 2026-01-23 10:36:27.477 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:27 compute-1 sudo[248758]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1659114460' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:36:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:27 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:27.754 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:28.062 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:28 compute-1 ceph-mon[80126]: pgmap v1302: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:36:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:36:28 compute-1 ceph-mon[80126]: pgmap v1303: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:36:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:36:28 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:36:28 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:29 compute-1 sudo[248815]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:36:29 compute-1 sudo[248815]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:29 compute-1 sudo[248815]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:29.757 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:30.065 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:30 compute-1 nova_compute[225705]: 2026-01-23 10:36:30.587 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:30 compute-1 podman[248841]: 2026-01-23 10:36:30.760964976 +0000 UTC m=+0.142224424 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:36:30 compute-1 ceph-mon[80126]: pgmap v1304: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:31.761 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:32.068 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:32 compute-1 nova_compute[225705]: 2026-01-23 10:36:32.480 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:33 compute-1 ceph-mon[80126]: pgmap v1305: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:33.765 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:33 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:34.070 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:34 compute-1 ceph-mon[80126]: pgmap v1306: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:34 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:34 compute-1 sudo[248871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:36:34 compute-1 sudo[248871]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:34 compute-1 sudo[248871]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:35 compute-1 nova_compute[225705]: 2026-01-23 10:36:35.591 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:35.768 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:36:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:36.073 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:36 compute-1 ceph-mon[80126]: pgmap v1307: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:37 compute-1 nova_compute[225705]: 2026-01-23 10:36:37.484 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:37.773 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:38.076 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:38 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:39 compute-1 ceph-mon[80126]: pgmap v1308: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 601 B/s rd, 0 op/s
Jan 23 10:36:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:39.777 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:40.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:40 compute-1 nova_compute[225705]: 2026-01-23 10:36:40.594 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:41 compute-1 ceph-mon[80126]: pgmap v1309: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:41.781 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:42.083 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:42 compute-1 nova_compute[225705]: 2026-01-23 10:36:42.484 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:43 compute-1 ceph-mon[80126]: pgmap v1310: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:43.784 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:43 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:44.086 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:44 compute-1 podman[248901]: 2026-01-23 10:36:44.647275374 +0000 UTC m=+0.052377179 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Jan 23 10:36:45 compute-1 ceph-mon[80126]: pgmap v1311: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:45 compute-1 nova_compute[225705]: 2026-01-23 10:36:45.599 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:36:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:45.788 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:36:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:46.088 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:47 compute-1 ceph-mon[80126]: pgmap v1312: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:47 compute-1 nova_compute[225705]: 2026-01-23 10:36:47.486 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:47.791 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:48.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:48 compute-1 ceph-mon[80126]: pgmap v1313: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:48 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2195883716' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:36:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/2195883716' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:36:49 compute-1 sudo[248922]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:36:49 compute-1 sudo[248922]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:36:49 compute-1 sudo[248922]: pam_unix(sudo:session): session closed for user root
Jan 23 10:36:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:49.794 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:50.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:50 compute-1 ceph-mon[80126]: pgmap v1314: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:36:50 compute-1 nova_compute[225705]: 2026-01-23 10:36:50.602 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:51.798 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:52.096 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:36:52 compute-1 nova_compute[225705]: 2026-01-23 10:36:52.489 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:52 compute-1 ceph-mon[80126]: pgmap v1315: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:36:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:53.801 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:36:53 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:54.100 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:54 compute-1 ceph-mon[80126]: pgmap v1316: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:36:55.070 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:36:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:36:55.070 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:36:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:36:55.070 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:36:55 compute-1 nova_compute[225705]: 2026-01-23 10:36:55.607 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:55.805 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:36:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:56.102 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:36:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:36:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:36:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:36:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:36:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:36:57 compute-1 ceph-mon[80126]: pgmap v1317: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:36:57 compute-1 nova_compute[225705]: 2026-01-23 10:36:57.492 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:36:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:57.808 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:36:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:36:58.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:36:58 compute-1 ceph-mon[80126]: pgmap v1318: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:36:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:36:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:36:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:36:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:36:59.811 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:00.106 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:00 compute-1 nova_compute[225705]: 2026-01-23 10:37:00.610 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:00 compute-1 ceph-mon[80126]: pgmap v1319: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:01 compute-1 podman[248955]: 2026-01-23 10:37:01.75596955 +0000 UTC m=+0.155487999 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Jan 23 10:37:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:01.814 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:02.109 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:02 compute-1 nova_compute[225705]: 2026-01-23 10:37:02.494 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:03 compute-1 ceph-mon[80126]: pgmap v1320: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:37:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:03.818 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:37:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:04.112 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:05 compute-1 ceph-mon[80126]: pgmap v1321: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:05 compute-1 nova_compute[225705]: 2026-01-23 10:37:05.614 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:05.822 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:06.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:06 compute-1 ceph-mon[80126]: pgmap v1322: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:07 compute-1 nova_compute[225705]: 2026-01-23 10:37:07.496 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:07.826 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:08.118 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:09 compute-1 ceph-mon[80126]: pgmap v1323: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:09 compute-1 sudo[248984]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:37:09 compute-1 sudo[248984]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:09 compute-1 sudo[248984]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:09.829 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:10.121 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:10 compute-1 nova_compute[225705]: 2026-01-23 10:37:10.618 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:10 compute-1 ceph-mon[80126]: pgmap v1324: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:11.832 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:12.125 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:12 compute-1 nova_compute[225705]: 2026-01-23 10:37:12.497 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:13 compute-1 ceph-mon[80126]: pgmap v1325: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:13.836 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:14.128 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:14 compute-1 ceph-mon[80126]: pgmap v1326: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:15 compute-1 nova_compute[225705]: 2026-01-23 10:37:15.621 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:15 compute-1 podman[249013]: 2026-01-23 10:37:15.666292425 +0000 UTC m=+0.068900318 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Jan 23 10:37:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:15.841 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:15 compute-1 nova_compute[225705]: 2026-01-23 10:37:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:15 compute-1 nova_compute[225705]: 2026-01-23 10:37:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:37:15 compute-1 nova_compute[225705]: 2026-01-23 10:37:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:37:15 compute-1 nova_compute[225705]: 2026-01-23 10:37:15.911 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:37:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:16.131 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:16 compute-1 ceph-mon[80126]: pgmap v1327: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:16 compute-1 nova_compute[225705]: 2026-01-23 10:37:16.905 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:17 compute-1 nova_compute[225705]: 2026-01-23 10:37:17.499 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:17.844 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:18.134 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:19 compute-1 ceph-mon[80126]: pgmap v1328: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:19.847 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:19 compute-1 nova_compute[225705]: 2026-01-23 10:37:19.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:19 compute-1 nova_compute[225705]: 2026-01-23 10:37:19.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:19 compute-1 nova_compute[225705]: 2026-01-23 10:37:19.942 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:37:19 compute-1 nova_compute[225705]: 2026-01-23 10:37:19.943 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:37:19 compute-1 nova_compute[225705]: 2026-01-23 10:37:19.943 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:37:19 compute-1 nova_compute[225705]: 2026-01-23 10:37:19.944 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:37:19 compute-1 nova_compute[225705]: 2026-01-23 10:37:19.944 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:37:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:20.137 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:20 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:37:20 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2129370149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.432 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.488s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.604 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.606 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4849MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.606 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.607 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.625 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.745 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.746 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:37:20 compute-1 nova_compute[225705]: 2026-01-23 10:37:20.805 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:37:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:37:21 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3810051989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:21 compute-1 nova_compute[225705]: 2026-01-23 10:37:21.263 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:37:21 compute-1 nova_compute[225705]: 2026-01-23 10:37:21.270 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:37:21 compute-1 nova_compute[225705]: 2026-01-23 10:37:21.289 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:37:21 compute-1 nova_compute[225705]: 2026-01-23 10:37:21.291 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:37:21 compute-1 nova_compute[225705]: 2026-01-23 10:37:21.292 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.685s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:37:21 compute-1 ceph-mon[80126]: pgmap v1329: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:21 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2129370149' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:21 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3810051989' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:21.854 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:22.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:22 compute-1 nova_compute[225705]: 2026-01-23 10:37:22.502 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:22 compute-1 ceph-mon[80126]: pgmap v1330: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:23 compute-1 nova_compute[225705]: 2026-01-23 10:37:23.292 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:23 compute-1 nova_compute[225705]: 2026-01-23 10:37:23.292 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:23.857 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:23 compute-1 nova_compute[225705]: 2026-01-23 10:37:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:24.144 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:24 compute-1 ceph-mon[80126]: pgmap v1331: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:24 compute-1 nova_compute[225705]: 2026-01-23 10:37:24.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:24 compute-1 nova_compute[225705]: 2026-01-23 10:37:24.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:37:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3870511584' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/583159563' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:25 compute-1 nova_compute[225705]: 2026-01-23 10:37:25.629 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:25.861 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:25 compute-1 nova_compute[225705]: 2026-01-23 10:37:25.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:26.147 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:26 compute-1 ceph-mon[80126]: pgmap v1332: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/574915668' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3220990230' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:37:26 compute-1 nova_compute[225705]: 2026-01-23 10:37:26.869 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:37:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:27 compute-1 nova_compute[225705]: 2026-01-23 10:37:27.505 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:27.864 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:28.149 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:29 compute-1 ceph-mon[80126]: pgmap v1333: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:29 compute-1 sudo[249083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:37:29 compute-1 sudo[249083]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:29 compute-1 sudo[249083]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:29.867 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:30.152 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:30 compute-1 nova_compute[225705]: 2026-01-23 10:37:30.633 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:30 compute-1 ceph-mon[80126]: pgmap v1334: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:31.870 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:32.156 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:32 compute-1 nova_compute[225705]: 2026-01-23 10:37:32.507 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:32 compute-1 podman[249109]: 2026-01-23 10:37:32.684646214 +0000 UTC m=+0.086920764 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Jan 23 10:37:33 compute-1 ceph-mon[80126]: pgmap v1335: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:37:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:33.874 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:37:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:34.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:35 compute-1 sudo[249135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:37:35 compute-1 sudo[249135]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:35 compute-1 sudo[249135]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:35 compute-1 ceph-mon[80126]: pgmap v1336: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:35 compute-1 sudo[249160]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:37:35 compute-1 sudo[249160]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:35 compute-1 nova_compute[225705]: 2026-01-23 10:37:35.637 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:35 compute-1 sudo[249160]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:35.876 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:36.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:37 compute-1 nova_compute[225705]: 2026-01-23 10:37:37.511 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:37 compute-1 ceph-mon[80126]: pgmap v1337: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:37.880 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:38.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:39.882 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:40 compute-1 ceph-mon[80126]: pgmap v1338: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:37:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:40.166 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:37:40 compute-1 nova_compute[225705]: 2026-01-23 10:37:40.690 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:41 compute-1 ceph-mon[80126]: pgmap v1339: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:41 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:37:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:41.886 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:37:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:42.169 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:42 compute-1 nova_compute[225705]: 2026-01-23 10:37:42.513 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:37:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:37:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:37:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:37:42 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:37:43 compute-1 ceph-mon[80126]: pgmap v1340: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Jan 23 10:37:43 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:43 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:43 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:43.890 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:44.173 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:44 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:44 compute-1 ceph-mon[80126]: pgmap v1341: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Jan 23 10:37:45 compute-1 nova_compute[225705]: 2026-01-23 10:37:45.742 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:45 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:45 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:45 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:45.893 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:46.174 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:46 compute-1 podman[249222]: 2026-01-23 10:37:46.647370425 +0000 UTC m=+0.054171033 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Jan 23 10:37:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:47 compute-1 ceph-mon[80126]: pgmap v1342: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Jan 23 10:37:47 compute-1 nova_compute[225705]: 2026-01-23 10:37:47.515 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:47 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:47 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:47 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:47.896 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:48.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #76. Immutable memtables: 0.
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.073462) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 45] Flushing memtable with next log file: 76
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669073530, "job": 45, "event": "flush_started", "num_memtables": 1, "num_entries": 1357, "num_deletes": 257, "total_data_size": 3394338, "memory_usage": 3463760, "flush_reason": "Manual Compaction"}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 45] Level-0 flush table #77: started
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669360911, "cf_name": "default", "job": 45, "event": "table_file_creation", "file_number": 77, "file_size": 2198714, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 39382, "largest_seqno": 40733, "table_properties": {"data_size": 2192833, "index_size": 3208, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1605, "raw_key_size": 12447, "raw_average_key_size": 19, "raw_value_size": 2180973, "raw_average_value_size": 3450, "num_data_blocks": 137, "num_entries": 632, "num_filter_entries": 632, "num_deletions": 257, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164555, "oldest_key_time": 1769164555, "file_creation_time": 1769164669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 77, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 45] Flush lasted 287509 microseconds, and 9364 cpu microseconds.
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.360968) [db/flush_job.cc:967] [default] [JOB 45] Level-0 flush table #77: 2198714 bytes OK
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.360992) [db/memtable_list.cc:519] [default] Level-0 commit table #77 started
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.402440) [db/memtable_list.cc:722] [default] Level-0 commit table #77: memtable #1 done
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.402542) EVENT_LOG_v1 {"time_micros": 1769164669402479, "job": 45, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.402573) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 45] Try to delete WAL files size 3387978, prev total WAL file size 3405698, number of live WAL files 2.
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000073.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.403969) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0031303130' seq:72057594037927935, type:22 .. '6C6F676D0031323633' seq:0, type:0; will stop at (end)
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 46] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 45 Base level 0, inputs: [77(2147KB)], [75(12MB)]
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669404063, "job": 46, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [77], "files_L6": [75], "score": -1, "input_data_size": 15313864, "oldest_snapshot_seqno": -1}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 46] Generated table #78: 6942 keys, 15161136 bytes, temperature: kUnknown
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669506052, "cf_name": "default", "job": 46, "event": "table_file_creation", "file_number": 78, "file_size": 15161136, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 15114592, "index_size": 28064, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17413, "raw_key_size": 182798, "raw_average_key_size": 26, "raw_value_size": 14989250, "raw_average_value_size": 2159, "num_data_blocks": 1100, "num_entries": 6942, "num_filter_entries": 6942, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164669, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 78, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.507212) [db/compaction/compaction_job.cc:1663] [default] [JOB 46] Compacted 1@0 + 1@6 files to L6 => 15161136 bytes
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.508896) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 150.0 rd, 148.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 12.5 +0.0 blob) out(14.5 +0.0 blob), read-write-amplify(13.9) write-amplify(6.9) OK, records in: 7472, records dropped: 530 output_compression: NoCompression
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.508927) EVENT_LOG_v1 {"time_micros": 1769164669508915, "job": 46, "event": "compaction_finished", "compaction_time_micros": 102059, "compaction_time_cpu_micros": 37232, "output_level": 6, "num_output_files": 1, "total_output_size": 15161136, "num_input_records": 7472, "num_output_records": 6942, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000077.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669509437, "job": 46, "event": "table_file_deletion", "file_number": 77}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000075.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164669512107, "job": 46, "event": "table_file_deletion", "file_number": 75}
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.403833) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512149) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512154) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512156) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512158) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:37:49.512160) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:37:49 compute-1 sudo[249243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:37:49 compute-1 sudo[249243]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:49 compute-1 sudo[249243]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:49 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:49 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:49 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:49.899 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:49 compute-1 ceph-mon[80126]: pgmap v1343: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Jan 23 10:37:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:50.178 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:50 compute-1 nova_compute[225705]: 2026-01-23 10:37:50.759 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:51 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/980483264' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:37:51 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/980483264' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:37:51 compute-1 ceph-mon[80126]: pgmap v1344: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 517 B/s rd, 0 op/s
Jan 23 10:37:51 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:37:51 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:51 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:51 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:51.903 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:52.181 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:52 compute-1 nova_compute[225705]: 2026-01-23 10:37:52.518 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:53 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:53 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:53 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:53.906 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:54.184 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:37:55.071 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:37:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:37:55.072 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:37:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:37:55.072 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:37:55 compute-1 ceph-mon[80126]: pgmap v1345: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 775 B/s rd, 0 op/s
Jan 23 10:37:55 compute-1 nova_compute[225705]: 2026-01-23 10:37:55.833 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:55 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:55 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:55 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:55.910 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:37:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:56.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:37:56 compute-1 sudo[249271]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:37:56 compute-1 sudo[249271]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:37:56 compute-1 sudo[249271]: pam_unix(sudo:session): session closed for user root
Jan 23 10:37:56 compute-1 ceph-mon[80126]: pgmap v1346: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:56 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:37:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:37:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:37:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:37:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:37:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:37:57 compute-1 nova_compute[225705]: 2026-01-23 10:37:57.520 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:37:57 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:57 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:57 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:57.913 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:58 compute-1 ceph-mon[80126]: pgmap v1347: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:37:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:37:58.190 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:37:59 compute-1 ceph-mon[80126]: pgmap v1348: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:37:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:37:59 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:37:59 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:37:59 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:37:59.916 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:00.193 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:00 compute-1 nova_compute[225705]: 2026-01-23 10:38:00.846 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:00 compute-1 ceph-mon[80126]: pgmap v1349: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:01 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:01 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:01 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:01.920 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:02.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:02 compute-1 nova_compute[225705]: 2026-01-23 10:38:02.522 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:03 compute-1 ceph-mon[80126]: pgmap v1350: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:03 compute-1 systemd[1]: Starting dnf makecache...
Jan 23 10:38:03 compute-1 podman[249300]: 2026-01-23 10:38:03.697378857 +0000 UTC m=+0.100779670 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, 
org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Jan 23 10:38:03 compute-1 dnf[249301]: Metadata cache refreshed recently.
Jan 23 10:38:03 compute-1 systemd[1]: dnf-makecache.service: Deactivated successfully.
Jan 23 10:38:03 compute-1 systemd[1]: Finished dnf makecache.
Jan 23 10:38:03 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:03 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:03 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:03.923 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:04.197 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:05 compute-1 ceph-mon[80126]: pgmap v1351: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:05 compute-1 nova_compute[225705]: 2026-01-23 10:38:05.885 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:05 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:05 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:05 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:05.927 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:06.200 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:07 compute-1 nova_compute[225705]: 2026-01-23 10:38:07.525 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:07 compute-1 ceph-mon[80126]: pgmap v1352: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:07 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:07 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:07 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:07.932 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:38:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:08.202 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:38:08 compute-1 ceph-mon[80126]: pgmap v1353: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:09 compute-1 sudo[249331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:38:09 compute-1 sudo[249331]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:09 compute-1 sudo[249331]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:09 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:09 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:38:09 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:09.935 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:38:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.002000063s ======
Jan 23 10:38:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:10.204 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.002000063s
Jan 23 10:38:10 compute-1 nova_compute[225705]: 2026-01-23 10:38:10.925 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:11 compute-1 ceph-mon[80126]: pgmap v1354: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:11 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:11 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:11 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:11.939 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:12.207 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:12 compute-1 nova_compute[225705]: 2026-01-23 10:38:12.528 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:13 compute-1 ceph-mon[80126]: pgmap v1355: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:13 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:13 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:13 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:13.943 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:14.210 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:15 compute-1 ceph-mon[80126]: pgmap v1356: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:15 compute-1 nova_compute[225705]: 2026-01-23 10:38:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:15 compute-1 nova_compute[225705]: 2026-01-23 10:38:15.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:38:15 compute-1 nova_compute[225705]: 2026-01-23 10:38:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:38:15 compute-1 nova_compute[225705]: 2026-01-23 10:38:15.891 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:38:15 compute-1 nova_compute[225705]: 2026-01-23 10:38:15.930 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:15 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:15 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:15 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:15.949 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:16.214 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:17 compute-1 ceph-mon[80126]: pgmap v1357: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:17 compute-1 nova_compute[225705]: 2026-01-23 10:38:17.531 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:17 compute-1 podman[249360]: 2026-01-23 10:38:17.7074783 +0000 UTC m=+0.092225550 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Jan 23 10:38:17 compute-1 nova_compute[225705]: 2026-01-23 10:38:17.886 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:17 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:17 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:17 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:17.952 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:18.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:19 compute-1 sshd-session[249379]: Invalid user sol from 45.148.10.240 port 58324
Jan 23 10:38:19 compute-1 sshd-session[249379]: Connection closed by invalid user sol 45.148.10.240 port 58324 [preauth]
Jan 23 10:38:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:19 compute-1 ceph-mon[80126]: pgmap v1358: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:19 compute-1 nova_compute[225705]: 2026-01-23 10:38:19.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:19 compute-1 nova_compute[225705]: 2026-01-23 10:38:19.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:19 compute-1 nova_compute[225705]: 2026-01-23 10:38:19.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:38:19 compute-1 nova_compute[225705]: 2026-01-23 10:38:19.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:38:19 compute-1 nova_compute[225705]: 2026-01-23 10:38:19.903 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:38:19 compute-1 nova_compute[225705]: 2026-01-23 10:38:19.904 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:38:19 compute-1 nova_compute[225705]: 2026-01-23 10:38:19.904 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:38:19 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:19 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:19 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:19.955 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:20.219 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:20 compute-1 nova_compute[225705]: 2026-01-23 10:38:20.934 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:38:21 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2310328428' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.077 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 1.174s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:38:21 compute-1 ceph-mon[80126]: pgmap v1359: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.287 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.289 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4836MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.290 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.292 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.361 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.361 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.437 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:38:21 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:38:21 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/387035444' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.937 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.500s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.945 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:38:21 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:21 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:38:21 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:21.958 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.964 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.967 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:38:21 compute-1 nova_compute[225705]: 2026-01-23 10:38:21.967 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.676s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:38:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2310328428' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:22 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/387035444' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:22.221 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:22 compute-1 nova_compute[225705]: 2026-01-23 10:38:22.533 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:23 compute-1 ceph-mon[80126]: pgmap v1360: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:23 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:23 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:23 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:23.960 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:24.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3620960525' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:25 compute-1 ceph-mon[80126]: pgmap v1361: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/15073574' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/542188124' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:25 compute-1 nova_compute[225705]: 2026-01-23 10:38:25.936 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:25 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:25 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:25 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:25.964 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:25 compute-1 nova_compute[225705]: 2026-01-23 10:38:25.967 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:25 compute-1 nova_compute[225705]: 2026-01-23 10:38:25.968 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:25 compute-1 nova_compute[225705]: 2026-01-23 10:38:25.968 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:25 compute-1 nova_compute[225705]: 2026-01-23 10:38:25.968 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:25 compute-1 nova_compute[225705]: 2026-01-23 10:38:25.969 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:38:25 compute-1 nova_compute[225705]: 2026-01-23 10:38:25.969 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:38:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:26.228 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:26 compute-1 ceph-mon[80126]: pgmap v1362: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3162061129' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:38:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:27 compute-1 nova_compute[225705]: 2026-01-23 10:38:27.535 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:27 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:27 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:27 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:27.969 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:28.229 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:28 compute-1 ceph-mon[80126]: pgmap v1363: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:29 compute-1 sudo[249431]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:38:29 compute-1 sudo[249431]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:29 compute-1 sudo[249431]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:29 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:29 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:29 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:29.972 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:30.231 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:30 compute-1 nova_compute[225705]: 2026-01-23 10:38:30.938 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:31 compute-1 ceph-mon[80126]: pgmap v1364: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:31 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:31 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:31 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:31.975 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:32.233 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:32 compute-1 nova_compute[225705]: 2026-01-23 10:38:32.539 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:33 compute-1 ceph-mon[80126]: pgmap v1365: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:33 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:33 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:33 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:33.980 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:34.236 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:34 compute-1 podman[249458]: 2026-01-23 10:38:34.711098523 +0000 UTC m=+0.110435334 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 10:38:35 compute-1 ceph-mon[80126]: pgmap v1366: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:35 compute-1 nova_compute[225705]: 2026-01-23 10:38:35.942 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:35 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:35 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:35 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:35.984 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:36.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:37 compute-1 nova_compute[225705]: 2026-01-23 10:38:37.541 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:37 compute-1 ceph-mon[80126]: pgmap v1367: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:37 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:37 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:37 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:37.988 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:38.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:38 compute-1 ceph-mon[80126]: pgmap v1368: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:39 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:39 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:39 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:39.992 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:40.243 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:40 compute-1 nova_compute[225705]: 2026-01-23 10:38:40.945 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:40 compute-1 ceph-mon[80126]: pgmap v1369: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:41 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:41 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:41 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:41.996 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:42.246 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:42 compute-1 nova_compute[225705]: 2026-01-23 10:38:42.543 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:43 compute-1 ceph-mon[80126]: pgmap v1370: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:44.000 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:44.249 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:44 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:45 compute-1 ceph-mon[80126]: pgmap v1371: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:45 compute-1 nova_compute[225705]: 2026-01-23 10:38:45.949 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:46.003 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:46.251 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:47 compute-1 ceph-mon[80126]: pgmap v1372: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:47 compute-1 nova_compute[225705]: 2026-01-23 10:38:47.545 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:38:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:48.008 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:38:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:48.254 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:48 compute-1 podman[249492]: 2026-01-23 10:38:48.669644645 +0000 UTC m=+0.060933416 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Jan 23 10:38:49 compute-1 ceph-mon[80126]: pgmap v1373: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3411057269' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:38:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/3411057269' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:38:49 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:50.012 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:50.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:38:50 compute-1 nova_compute[225705]: 2026-01-23 10:38:50.953 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:51 compute-1 sudo[249513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:38:51 compute-1 sudo[249513]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:51 compute-1 sudo[249513]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:51 compute-1 ceph-mon[80126]: pgmap v1374: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:38:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:52.016 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:38:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:38:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:52.258 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:38:52 compute-1 nova_compute[225705]: 2026-01-23 10:38:52.547 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:53 compute-1 ceph-mon[80126]: pgmap v1375: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:38:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:54.021 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:54.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:54 compute-1 ceph-mon[80126]: pgmap v1376: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:38:55.072 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:38:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:38:55.073 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:38:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:38:55.073 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:38:55 compute-1 nova_compute[225705]: 2026-01-23 10:38:55.957 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:56.024 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:56.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:56 compute-1 sudo[249541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:38:56 compute-1 sudo[249541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:56 compute-1 sudo[249541]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:56 compute-1 sudo[249566]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:38:56 compute-1 sudo[249566]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:38:56 compute-1 ceph-mon[80126]: pgmap v1377: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:38:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:38:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:38:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:38:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:38:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:38:57 compute-1 sudo[249566]: pam_unix(sudo:session): session closed for user root
Jan 23 10:38:57 compute-1 nova_compute[225705]: 2026-01-23 10:38:57.548 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:38:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:38:58.029 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:38:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:38:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:38:58.265 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:38:58 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:38:58 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:38:58 compute-1 ceph-mon[80126]: pgmap v1378: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 793 B/s rd, 0 op/s
Jan 23 10:38:58 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:38:58 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:38:58 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:38:58 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:38:58 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:38:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:38:59 compute-1 ceph-mon[80126]: pgmap v1379: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 529 B/s rd, 0 op/s
Jan 23 10:39:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:00.032 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:00.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:00 compute-1 nova_compute[225705]: 2026-01-23 10:39:00.959 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:01 compute-1 ceph-mon[80126]: pgmap v1380: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 793 B/s rd, 0 op/s
Jan 23 10:39:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.003000095s ======
Jan 23 10:39:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:02.035 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.003000095s
Jan 23 10:39:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:02.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:02 compute-1 nova_compute[225705]: 2026-01-23 10:39:02.549 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:04.040 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:04 compute-1 sudo[249627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:39:04 compute-1 sudo[249627]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:04 compute-1 sudo[249627]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:04 compute-1 ceph-mon[80126]: pgmap v1381: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 529 B/s rd, 0 op/s
Jan 23 10:39:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:39:04 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:39:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:04.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:05 compute-1 podman[249653]: 2026-01-23 10:39:05.729536132 +0000 UTC m=+0.129121381 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, 
org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Jan 23 10:39:05 compute-1 nova_compute[225705]: 2026-01-23 10:39:05.961 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:06.044 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:06.274 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:06 compute-1 ceph-mon[80126]: pgmap v1382: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 529 B/s rd, 0 op/s
Jan 23 10:39:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:07 compute-1 nova_compute[225705]: 2026-01-23 10:39:07.551 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:08.047 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:08.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:08 compute-1 ceph-mon[80126]: pgmap v1383: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 793 B/s rd, 0 op/s
Jan 23 10:39:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:09 compute-1 ceph-mon[80126]: pgmap v1384: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:10.051 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:10.278 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:10 compute-1 nova_compute[225705]: 2026-01-23 10:39:10.965 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:11 compute-1 sudo[249681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:39:11 compute-1 sudo[249681]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:11 compute-1 sudo[249681]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:12.054 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:12 compute-1 ceph-mon[80126]: pgmap v1385: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:12.281 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:12 compute-1 nova_compute[225705]: 2026-01-23 10:39:12.554 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:14.059 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:14 compute-1 ceph-mon[80126]: pgmap v1386: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:14.283 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:15 compute-1 nova_compute[225705]: 2026-01-23 10:39:15.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:15 compute-1 nova_compute[225705]: 2026-01-23 10:39:15.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:39:15 compute-1 nova_compute[225705]: 2026-01-23 10:39:15.876 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:39:15 compute-1 nova_compute[225705]: 2026-01-23 10:39:15.969 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:16.063 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:16 compute-1 nova_compute[225705]: 2026-01-23 10:39:16.088 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:39:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:16.286 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:16 compute-1 ceph-mon[80126]: pgmap v1387: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:17 compute-1 nova_compute[225705]: 2026-01-23 10:39:17.555 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:17 compute-1 ceph-mon[80126]: pgmap v1388: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:18.067 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:18 compute-1 nova_compute[225705]: 2026-01-23 10:39:18.081 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:18.290 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:19 compute-1 podman[249711]: 2026-01-23 10:39:19.654224368 +0000 UTC m=+0.055315150 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', 
'/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 10:39:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:20.072 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:39:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:20.293 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:39:20 compute-1 ceph-mon[80126]: pgmap v1389: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:20 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:20 compute-1 nova_compute[225705]: 2026-01-23 10:39:20.973 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:21 compute-1 nova_compute[225705]: 2026-01-23 10:39:21.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:21 compute-1 nova_compute[225705]: 2026-01-23 10:39:21.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:21 compute-1 ceph-mon[80126]: pgmap v1390: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:21 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:22.075 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:22.295 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:22 compute-1 nova_compute[225705]: 2026-01-23 10:39:22.546 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:39:22 compute-1 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:39:22 compute-1 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:39:22 compute-1 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:39:22 compute-1 nova_compute[225705]: 2026-01-23 10:39:22.547 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:39:22 compute-1 nova_compute[225705]: 2026-01-23 10:39:22.564 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:23 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:39:23 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1085583944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.027 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.480s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.210 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.211 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4841MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.212 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.212 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:39:23 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1085583944' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.507 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.508 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.538 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing inventories for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.571 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating ProviderTree inventory for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.572 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Updating inventory in ProviderTree for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.585 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing aggregate associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.606 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Refreshing trait associations for resource provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7, traits: COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,HW_CPU_X86_SSE4A,HW_CPU_X86_SSSE3,HW_CPU_X86_BMI2,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_SECURITY_TPM_2_0,COMPUTE_SECURITY_TPM_1_2,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSE2,COMPUTE_RESCUE_BFV,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_AVX2,COMPUTE_DEVICE_TAGGING,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_MMX,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_F16C,COMPUTE_NET_ATTACH_INTERFACE,HW_CPU_X86_FMA3,COMPUTE_NODE,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_AESNI,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_SHA,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_CLMUL,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_ABM,COMPUTE_GRAPHICS_MODEL_NONE,HW_CPU_X86_SVM,HW_CPU_X86_SSE41,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VOLUME_MULTI_ATTACH _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Jan 23 10:39:23 compute-1 nova_compute[225705]: 2026-01-23 10:39:23.623 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:39:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:39:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:24.079 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:39:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:39:24 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2536873901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:24 compute-1 nova_compute[225705]: 2026-01-23 10:39:24.101 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.477s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:39:24 compute-1 nova_compute[225705]: 2026-01-23 10:39:24.108 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:39:24 compute-1 nova_compute[225705]: 2026-01-23 10:39:24.137 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:39:24 compute-1 nova_compute[225705]: 2026-01-23 10:39:24.141 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:39:24 compute-1 nova_compute[225705]: 2026-01-23 10:39:24.142 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:39:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:24.298 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:24 compute-1 ceph-mon[80126]: pgmap v1391: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:24 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2536873901' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:25 compute-1 nova_compute[225705]: 2026-01-23 10:39:25.977 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:26.084 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:26.301 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1040346182' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3838063102' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/712866539' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3515020931' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:39:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:27 compute-1 nova_compute[225705]: 2026-01-23 10:39:27.559 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:28.087 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:28 compute-1 nova_compute[225705]: 2026-01-23 10:39:28.142 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:28 compute-1 nova_compute[225705]: 2026-01-23 10:39:28.142 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:28 compute-1 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:28 compute-1 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:28 compute-1 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:28 compute-1 nova_compute[225705]: 2026-01-23 10:39:28.143 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:39:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:28.304 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:28 compute-1 ceph-mon[80126]: pgmap v1392: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:28 compute-1 ceph-mon[80126]: pgmap v1393: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:29 compute-1 nova_compute[225705]: 2026-01-23 10:39:29.870 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:39:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:30.091 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:30.307 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:30 compute-1 ceph-mon[80126]: pgmap v1394: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:30 compute-1 nova_compute[225705]: 2026-01-23 10:39:30.981 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:31 compute-1 sudo[249779]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:39:31 compute-1 sudo[249779]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:31 compute-1 sudo[249779]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:32.094 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:32.310 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:32 compute-1 ceph-mon[80126]: pgmap v1395: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:32 compute-1 nova_compute[225705]: 2026-01-23 10:39:32.561 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:34 compute-1 ceph-mon[80126]: pgmap v1396: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:34.097 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:34.311 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:35 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:35 compute-1 nova_compute[225705]: 2026-01-23 10:39:35.985 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:36.099 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:36.315 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:36 compute-1 podman[249807]: 2026-01-23 10:39:36.708849686 +0000 UTC m=+0.109788353 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Jan 23 10:39:36 compute-1 ceph-mon[80126]: pgmap v1397: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:36 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:37 compute-1 nova_compute[225705]: 2026-01-23 10:39:37.563 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #79. Immutable memtables: 0.
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.785049) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 47] Flushing memtable with next log file: 79
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777785130, "job": 47, "event": "flush_started", "num_memtables": 1, "num_entries": 1275, "num_deletes": 251, "total_data_size": 3246737, "memory_usage": 3285344, "flush_reason": "Manual Compaction"}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 47] Level-0 flush table #80: started
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777801585, "cf_name": "default", "job": 47, "event": "table_file_creation", "file_number": 80, "file_size": 2102623, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 40738, "largest_seqno": 42008, "table_properties": {"data_size": 2096978, "index_size": 3039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1541, "raw_key_size": 11955, "raw_average_key_size": 19, "raw_value_size": 2085740, "raw_average_value_size": 3482, "num_data_blocks": 131, "num_entries": 599, "num_filter_entries": 599, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164669, "oldest_key_time": 1769164669, "file_creation_time": 1769164777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 80, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 47] Flush lasted 16624 microseconds, and 7735 cpu microseconds.
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.801674) [db/flush_job.cc:967] [default] [JOB 47] Level-0 flush table #80: 2102623 bytes OK
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.801711) [db/memtable_list.cc:519] [default] Level-0 commit table #80 started
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.803773) [db/memtable_list.cc:722] [default] Level-0 commit table #80: memtable #1 done
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.803872) EVENT_LOG_v1 {"time_micros": 1769164777803851, "job": 47, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.803919) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 47] Try to delete WAL files size 3240717, prev total WAL file size 3240717, number of live WAL files 2.
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000076.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.805635) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730033323633' seq:72057594037927935, type:22 .. '7061786F730033353135' seq:0, type:0; will stop at (end)
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 48] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 47 Base level 0, inputs: [80(2053KB)], [78(14MB)]
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777805716, "job": 48, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [80], "files_L6": [78], "score": -1, "input_data_size": 17263759, "oldest_snapshot_seqno": -1}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 48] Generated table #81: 7025 keys, 14949781 bytes, temperature: kUnknown
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777910804, "cf_name": "default", "job": 48, "event": "table_file_creation", "file_number": 81, "file_size": 14949781, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 14903190, "index_size": 27919, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17605, "raw_key_size": 185241, "raw_average_key_size": 26, "raw_value_size": 14776802, "raw_average_value_size": 2103, "num_data_blocks": 1086, "num_entries": 7025, "num_filter_entries": 7025, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164777, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 81, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.911245) [db/compaction/compaction_job.cc:1663] [default] [JOB 48] Compacted 1@0 + 1@6 files to L6 => 14949781 bytes
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.914471) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 164.1 rd, 142.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.0, 14.5 +0.0 blob) out(14.3 +0.0 blob), read-write-amplify(15.3) write-amplify(7.1) OK, records in: 7541, records dropped: 516 output_compression: NoCompression
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.914563) EVENT_LOG_v1 {"time_micros": 1769164777914539, "job": 48, "event": "compaction_finished", "compaction_time_micros": 105232, "compaction_time_cpu_micros": 36066, "output_level": 6, "num_output_files": 1, "total_output_size": 14949781, "num_input_records": 7541, "num_output_records": 7025, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000080.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777915422, "job": 48, "event": "table_file_deletion", "file_number": 80}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000078.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164777918744, "job": 48, "event": "table_file_deletion", "file_number": 78}
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.805466) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918871) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918877) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918879) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918881) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:37 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:39:37.918883) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:39:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:39:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:38.104 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:39:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:38.317 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:38 compute-1 ceph-mon[80126]: pgmap v1398: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:40.108 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:40.321 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:40 compute-1 ceph-mon[80126]: pgmap v1399: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:41 compute-1 nova_compute[225705]: 2026-01-23 10:39:41.029 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:42.111 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:42.325 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:42 compute-1 nova_compute[225705]: 2026-01-23 10:39:42.564 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:42 compute-1 ceph-mon[80126]: pgmap v1400: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:44.115 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:44.327 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:44 compute-1 ceph-mon[80126]: pgmap v1401: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:44 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:46 compute-1 nova_compute[225705]: 2026-01-23 10:39:46.033 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:46.119 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:46.330 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:47 compute-1 ceph-mon[80126]: pgmap v1402: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:47 compute-1 nova_compute[225705]: 2026-01-23 10:39:47.565 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:39:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:48.123 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:39:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:48.334 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:48 compute-1 ceph-mon[80126]: pgmap v1403: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:49 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/786440771' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:39:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/786440771' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:39:49 compute-1 ceph-mon[80126]: pgmap v1404: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:39:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:50.129 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:39:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:50.337 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:50 compute-1 podman[249841]: 2026-01-23 10:39:50.672871711 +0000 UTC m=+0.069648011 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Jan 23 10:39:51 compute-1 nova_compute[225705]: 2026-01-23 10:39:51.037 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:51 compute-1 sudo[249862]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:39:51 compute-1 sudo[249862]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:39:51 compute-1 sudo[249862]: pam_unix(sudo:session): session closed for user root
Jan 23 10:39:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:39:51 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3004.3 total, 600.0 interval
                                           Cumulative writes: 14K writes, 51K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                           Cumulative WAL: 14K writes, 4378 syncs, 3.33 writes per sync, written: 0.04 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 764 writes, 1182 keys, 764 commit groups, 1.0 writes per commit group, ingest: 0.38 MB, 0.00 MB/s
                                           Interval WAL: 764 writes, 370 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:39:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:52.133 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:52.340 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:52 compute-1 nova_compute[225705]: 2026-01-23 10:39:52.567 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:53 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:39:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:54.138 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:54 compute-1 ceph-mon[80126]: pgmap v1405: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:54 compute-1 ceph-mon[80126]: pgmap v1406: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:39:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:54.343 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:39:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:39:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:39:55.074 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:39:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:39:55.075 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:39:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:39:55.075 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:39:55 compute-1 ceph-mon[80126]: pgmap v1407: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:39:56 compute-1 nova_compute[225705]: 2026-01-23 10:39:56.041 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:56.141 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:39:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:56.346 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:39:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:39:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:39:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:39:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:39:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:39:57 compute-1 ceph-mon[80126]: pgmap v1408: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:39:57 compute-1 nova_compute[225705]: 2026-01-23 10:39:57.569 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:39:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:39:58.145 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:39:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:39:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:39:58.349 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:39:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:00.150 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:40:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:00.352 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:40:00 compute-1 ceph-mon[80126]: pgmap v1409: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:00 compute-1 ceph-mon[80126]: overall HEALTH_WARN 2 OSD(s) experiencing slow operations in BlueStore; 2 failed cephadm daemon(s)
Jan 23 10:40:01 compute-1 nova_compute[225705]: 2026-01-23 10:40:01.046 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:02 compute-1 ceph-mon[80126]: pgmap v1410: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:02.153 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:02.355 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:02 compute-1 nova_compute[225705]: 2026-01-23 10:40:02.577 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:04 compute-1 ceph-mon[80126]: pgmap v1411: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:04.158 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:04 compute-1 sudo[249894]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:40:04 compute-1 sudo[249894]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:04 compute-1 sudo[249894]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:04.357 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:04 compute-1 sudo[249919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:40:04 compute-1 sudo[249919]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:04 compute-1 sudo[249919]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:06 compute-1 ceph-mon[80126]: pgmap v1412: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:40:06 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:06 compute-1 nova_compute[225705]: 2026-01-23 10:40:06.049 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:06.161 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000031s ======
Jan 23 10:40:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:06.359 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000031s
Jan 23 10:40:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:07 compute-1 ceph-mon[80126]: pgmap v1413: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 592 B/s rd, 0 op/s
Jan 23 10:40:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:40:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:40:07 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:07 compute-1 nova_compute[225705]: 2026-01-23 10:40:07.579 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:07 compute-1 podman[249978]: 2026-01-23 10:40:07.722532183 +0000 UTC m=+0.120878362 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Jan 23 10:40:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:08.164 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:40:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:08.362 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:40:08 compute-1 ceph-mon[80126]: pgmap v1414: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:10.168 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:10.365 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:10 compute-1 sudo[250005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Jan 23 10:40:10 compute-1 sudo[250005]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:10 compute-1 sudo[250005]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:11 compute-1 nova_compute[225705]: 2026-01-23 10:40:11.083 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:11 compute-1 ceph-mon[80126]: pgmap v1415: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:11 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:11 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:40:11 compute-1 sudo[250030]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:40:11 compute-1 sudo[250030]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:11 compute-1 sudo[250030]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:12.172 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:12.367 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:12 compute-1 nova_compute[225705]: 2026-01-23 10:40:12.582 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:13 compute-1 ceph-mon[80126]: pgmap v1416: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:14.176 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:40:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:14.370 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:40:14 compute-1 ceph-mon[80126]: pgmap v1417: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:15 compute-1 nova_compute[225705]: 2026-01-23 10:40:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:15 compute-1 nova_compute[225705]: 2026-01-23 10:40:15.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:40:15 compute-1 nova_compute[225705]: 2026-01-23 10:40:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:40:15 compute-1 nova_compute[225705]: 2026-01-23 10:40:15.890 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:40:16 compute-1 nova_compute[225705]: 2026-01-23 10:40:16.127 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:16.180 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:16.373 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:16 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:17 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:17 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:17 compute-1 ceph-mon[80126]: pgmap v1418: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 593 B/s rd, 0 op/s
Jan 23 10:40:17 compute-1 nova_compute[225705]: 2026-01-23 10:40:17.584 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:40:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:18.183 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:40:18 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:18 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:18 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:18.376 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:18 compute-1 ceph-mon[80126]: pgmap v1419: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:19 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:19 compute-1 nova_compute[225705]: 2026-01-23 10:40:19.884 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:40:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:20.187 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:40:20 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:20 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:40:20 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:20.379 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:40:21 compute-1 nova_compute[225705]: 2026-01-23 10:40:21.131 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:21 compute-1 ceph-mon[80126]: pgmap v1420: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:21 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:21 compute-1 podman[250061]: 2026-01-23 10:40:21.654822308 +0000 UTC m=+0.055019920 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true)
Jan 23 10:40:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:22 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:22 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:22.192 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:22 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:22 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000032s ======
Jan 23 10:40:22 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:22.382 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000032s
Jan 23 10:40:22 compute-1 nova_compute[225705]: 2026-01-23 10:40:22.586 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:22 compute-1 nova_compute[225705]: 2026-01-23 10:40:22.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:23 compute-1 ceph-mon[80126]: pgmap v1421: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:23 compute-1 nova_compute[225705]: 2026-01-23 10:40:23.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:23 compute-1 nova_compute[225705]: 2026-01-23 10:40:23.900 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:40:23 compute-1 nova_compute[225705]: 2026-01-23 10:40:23.901 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:40:23 compute-1 nova_compute[225705]: 2026-01-23 10:40:23.902 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:40:23 compute-1 nova_compute[225705]: 2026-01-23 10:40:23.902 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Auditing locally available compute resources for compute-1.ctlplane.example.com (node: compute-1.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Jan 23 10:40:23 compute-1 nova_compute[225705]: 2026-01-23 10:40:23.902 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:40:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:24.195 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:40:24 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1646194779' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:24 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:24 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:24 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:24.385 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.405 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.502s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.584 225709 WARNING nova.virt.libvirt.driver [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.585 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Hypervisor/Node resource view: name=compute-1.ctlplane.example.com free_ram=4849MB free_disk=59.988277435302734GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, 
"label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.585 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.586 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.837 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.837 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Final resource view: name=compute-1.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=59GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Jan 23 10:40:24 compute-1 nova_compute[225705]: 2026-01-23 10:40:24.857 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Jan 23 10:40:24 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #82. Immutable memtables: 0.
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.260092) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:856] [default] [JOB 49] Flushing memtable with next log file: 82
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825260160, "job": 49, "event": "flush_started", "num_memtables": 1, "num_entries": 710, "num_deletes": 251, "total_data_size": 1512488, "memory_usage": 1528584, "flush_reason": "Manual Compaction"}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:885] [default] [JOB 49] Level-0 flush table #83: started
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825281077, "cf_name": "default", "job": 49, "event": "table_file_creation", "file_number": 83, "file_size": 689452, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 42013, "largest_seqno": 42718, "table_properties": {"data_size": 686359, "index_size": 1001, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1029, "raw_key_size": 8261, "raw_average_key_size": 20, "raw_value_size": 679885, "raw_average_value_size": 1712, "num_data_blocks": 43, "num_entries": 397, "num_filter_entries": 397, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769164778, "oldest_key_time": 1769164778, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 83, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 49] Flush lasted 21088 microseconds, and 3957 cpu microseconds.
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.281168) [db/flush_job.cc:967] [default] [JOB 49] Level-0 flush table #83: 689452 bytes OK
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.281213) [db/memtable_list.cc:519] [default] Level-0 commit table #83 started
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.283325) [db/memtable_list.cc:722] [default] Level-0 commit table #83: memtable #1 done
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.283417) EVENT_LOG_v1 {"time_micros": 1769164825283398, "job": 49, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.283462) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 49] Try to delete WAL files size 1508683, prev total WAL file size 1508683, number of live WAL files 2.
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000079.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.284768) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740031323535' seq:72057594037927935, type:22 .. '6D6772737461740031353037' seq:0, type:0; will stop at (end)
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 50] Compacting 1@0 + 1@6 files to L6, score -1.00
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 49 Base level 0, inputs: [83(673KB)], [81(14MB)]
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825284859, "job": 50, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [83], "files_L6": [81], "score": -1, "input_data_size": 15639233, "oldest_snapshot_seqno": -1}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 50] Generated table #84: 6923 keys, 11743705 bytes, temperature: kUnknown
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825368046, "cf_name": "default", "job": 50, "event": "table_file_creation", "file_number": 84, "file_size": 11743705, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 11702287, "index_size": 23002, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 17349, "raw_key_size": 183324, "raw_average_key_size": 26, "raw_value_size": 11582026, "raw_average_value_size": 1672, "num_data_blocks": 892, "num_entries": 6923, "num_filter_entries": 6923, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769161847, "oldest_key_time": 0, "file_creation_time": 1769164825, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "1897ab4a-12ed-4850-8782-7d536e06cd96", "db_session_id": "PH7FUS34ITA44089QBF9", "orig_file_number": 84, "seqno_to_time_mapping": "N/A"}}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.368380) [db/compaction/compaction_job.cc:1663] [default] [JOB 50] Compacted 1@0 + 1@6 files to L6 => 11743705 bytes
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.370155) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 187.8 rd, 141.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 14.3 +0.0 blob) out(11.2 +0.0 blob), read-write-amplify(39.7) write-amplify(17.0) OK, records in: 7422, records dropped: 499 output_compression: NoCompression
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.370187) EVENT_LOG_v1 {"time_micros": 1769164825370173, "job": 50, "event": "compaction_finished", "compaction_time_micros": 83286, "compaction_time_cpu_micros": 27282, "output_level": 6, "num_output_files": 1, "total_output_size": 11743705, "num_input_records": 7422, "num_output_records": 6923, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000083.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825370583, "job": 50, "event": "table_file_deletion", "file_number": 83}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: pgmap v1422: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:25 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1646194779' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-compute-1/store.db/000081.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769164825375102, "job": 50, "event": "table_file_deletion", "file_number": 81}
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.284544) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375246) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375251) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375253) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375254) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-1 ceph-mon[80126]: rocksdb: (Original Log Time 2026/01/23-10:40:25.375255) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Jan 23 10:40:25 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Jan 23 10:40:25 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4269113456' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:25 compute-1 nova_compute[225705]: 2026-01-23 10:40:25.425 225709 DEBUG oslo_concurrency.processutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.568s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Jan 23 10:40:25 compute-1 nova_compute[225705]: 2026-01-23 10:40:25.432 225709 DEBUG nova.compute.provider_tree [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed in ProviderTree for provider: b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Jan 23 10:40:25 compute-1 nova_compute[225705]: 2026-01-23 10:40:25.446 225709 DEBUG nova.scheduler.client.report [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Inventory has not changed for provider b22b6ed5-7bca-42dc-9b99-6f2ad6853af7 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 59, 'reserved': 1, 'min_unit': 1, 'max_unit': 59, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Jan 23 10:40:25 compute-1 nova_compute[225705]: 2026-01-23 10:40:25.448 225709 DEBUG nova.compute.resource_tracker [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Compute_service record updated for compute-1.ctlplane.example.com:compute-1.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Jan 23 10:40:25 compute-1 nova_compute[225705]: 2026-01-23 10:40:25.448 225709 DEBUG oslo_concurrency.lockutils [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.863s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:40:26 compute-1 nova_compute[225705]: 2026-01-23 10:40:26.134 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:40:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:26.199 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:40:26 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:26 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:26 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:26.387 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4269113456' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1984292967' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:26 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1373761955' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:26 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:27 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:27 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:27 compute-1 nova_compute[225705]: 2026-01-23 10:40:27.449 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:27 compute-1 nova_compute[225705]: 2026-01-23 10:40:27.450 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:27 compute-1 nova_compute[225705]: 2026-01-23 10:40:27.587 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:27 compute-1 ceph-mon[80126]: pgmap v1423: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2247912976' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:27 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1342687352' entity='client.openstack' cmd=[{"prefix": "df", "format": "json"}]: dispatch
Jan 23 10:40:27 compute-1 nova_compute[225705]: 2026-01-23 10:40:27.876 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:28.203 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:28 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:28 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:28 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:28.389 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:28 compute-1 nova_compute[225705]: 2026-01-23 10:40:28.875 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:29 compute-1 ceph-mon[80126]: pgmap v1424: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:29 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:29 compute-1 nova_compute[225705]: 2026-01-23 10:40:29.873 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:40:29 compute-1 nova_compute[225705]: 2026-01-23 10:40:29.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Jan 23 10:40:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:40:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:30.208 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:40:30 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:30 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:40:30 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:30.392 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:40:31 compute-1 nova_compute[225705]: 2026-01-23 10:40:31.138 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:31 compute-1 ceph-mon[80126]: pgmap v1425: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:31 compute-1 sudo[250129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:40:31 compute-1 sudo[250129]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:31 compute-1 sudo[250129]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:31 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:32 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:32 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:40:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:32.213 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:40:32 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:32 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:32 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:32.395 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:32 compute-1 nova_compute[225705]: 2026-01-23 10:40:32.589 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:33 compute-1 ceph-mon[80126]: pgmap v1426: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:40:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:34.216 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:40:34 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:34 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:34 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:34.398 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:34 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:35 compute-1 ceph-mon[80126]: pgmap v1427: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:36 compute-1 nova_compute[225705]: 2026-01-23 10:40:36.142 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:36 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:36.220 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:36 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:36 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:40:36 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:36.401 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:40:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:37 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:37 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:37 compute-1 ceph-mon[80126]: pgmap v1428: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:37 compute-1 nova_compute[225705]: 2026-01-23 10:40:37.591 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:38.224 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:38 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:38 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:38 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:38.404 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:38 compute-1 podman[250157]: 2026-01-23 10:40:38.703008391 +0000 UTC m=+0.101754453 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Jan 23 10:40:39 compute-1 ceph-mon[80126]: pgmap v1429: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:39 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:40 compute-1 sshd-session[250184]: Invalid user sol from 45.148.10.240 port 45856
Jan 23 10:40:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:40.227 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:40 compute-1 sshd-session[250184]: Connection closed by invalid user sol 45.148.10.240 port 45856 [preauth]
Jan 23 10:40:40 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:40 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:40 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:40.406 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:41 compute-1 nova_compute[225705]: 2026-01-23 10:40:41.146 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:41 compute-1 ceph-mon[80126]: pgmap v1430: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:41 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:42 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:42 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:42.230 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:42 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:42 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:42 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:42.408 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:42 compute-1 nova_compute[225705]: 2026-01-23 10:40:42.593 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:43 compute-1 ceph-mon[80126]: pgmap v1431: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:43 compute-1 sshd-session[250187]: Accepted publickey for zuul from 192.168.122.10 port 43086 ssh2: ECDSA SHA256:VirhpRcIg3eaQ2of1D68YV1JVeFZwgFg3WdbJHtted4
Jan 23 10:40:43 compute-1 systemd-logind[807]: New session 58 of user zuul.
Jan 23 10:40:43 compute-1 systemd[1]: Started Session 58 of User zuul.
Jan 23 10:40:43 compute-1 sshd-session[250187]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by zuul(uid=0)
Jan 23 10:40:43 compute-1 sudo[250192]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c 'rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt'
Jan 23 10:40:43 compute-1 sudo[250192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Jan 23 10:40:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:44.234 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:44 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:44 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:44 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:44.410 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:44 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:45 compute-1 ceph-mon[80126]: pgmap v1432: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:46 compute-1 nova_compute[225705]: 2026-01-23 10:40:46.150 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:46.238 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:46 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:46 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:40:46 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:46.413 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:40:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:46 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:47 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:47 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:40:47 compute-1 ceph-mon[80126]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3000.0 total, 600.0 interval
                                           Cumulative writes: 8038 writes, 42K keys, 8038 commit groups, 1.0 writes per commit group, ingest: 0.10 GB, 0.03 MB/s
                                           Cumulative WAL: 8038 writes, 8038 syncs, 1.00 writes per sync, written: 0.10 GB, 0.03 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1508 writes, 7661 keys, 1508 commit groups, 1.0 writes per commit group, ingest: 17.28 MB, 0.03 MB/s
                                           Interval WAL: 1508 writes, 1508 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   1.0      0.0     39.5      1.53              0.19        25    0.061       0      0       0.0       0.0
                                             L6      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   5.0    102.4     88.0      3.44              0.88        24    0.143    146K    13K       0.0       0.0
                                            Sum      1/0   11.20 MB   0.0      0.3     0.1      0.3       0.4      0.1       0.0   6.0     70.8     73.1      4.97              1.07        49    0.101    146K    13K       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.1     0.0      0.1       0.1      0.0       0.0   8.7     50.2     48.9      2.07              0.32        14    0.148     51K   4033       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Low      0/0    0.00 KB   0.0      0.3     0.1      0.3       0.3      0.0       0.0   0.0    102.4     88.0      3.44              0.88        24    0.143    146K    13K       0.0       0.0
                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.1      0.1       0.0   0.0      0.0     39.6      1.53              0.19        24    0.064       0      0       0.0       0.0
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.8      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 3000.0 total, 600.0 interval
                                           Flush(GB): cumulative 0.059, interval 0.011
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.35 GB write, 0.12 MB/s write, 0.34 GB read, 0.12 MB/s read, 5.0 seconds
                                           Interval compaction: 0.10 GB write, 0.17 MB/s write, 0.10 GB read, 0.17 MB/s read, 2.1 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x563e79a9f350#2 capacity: 304.00 MB usage: 31.84 MB table_size: 0 occupancy: 18446744073709551615 collections: 6 last_copies: 0 last_secs: 0.000337 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1898,30.76 MB,10.1171%) FilterBlock(49,427.73 KB,0.137404%) IndexBlock(49,682.17 KB,0.219139%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
Jan 23 10:40:47 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "status"} v 0)
Jan 23 10:40:47 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/390486485' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:47 compute-1 nova_compute[225705]: 2026-01-23 10:40:47.646 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:47 compute-1 ceph-mon[80126]: pgmap v1433: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:47 compute-1 ceph-mon[80126]: from='client.27121 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:47 compute-1 ceph-mon[80126]: from='client.27002 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:47 compute-1 ceph-mon[80126]: from='client.17607 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:47 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2902829741' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:48.241 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:48 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:48 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:48 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:48.416 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:49 compute-1 ceph-mon[80126]: from='client.27130 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:49 compute-1 ceph-mon[80126]: from='client.17613 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:49 compute-1 ceph-mon[80126]: from='client.27008 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/390486485' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1900352961' entity='client.admin' cmd=[{"prefix": "status"}]: dispatch
Jan 23 10:40:49 compute-1 ceph-mon[80126]: pgmap v1434: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:49 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1287907540' entity='client.openstack' cmd=[{"prefix":"df", "format":"json"}]: dispatch
Jan 23 10:40:49 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:50.244 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:50 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:50 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:50 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:50.418 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:50 compute-1 ceph-mon[80126]: from='client.? 192.168.122.10:0/1287907540' entity='client.openstack' cmd=[{"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"}]: dispatch
Jan 23 10:40:50 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:40:51 compute-1 nova_compute[225705]: 2026-01-23 10:40:51.153 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:51 compute-1 ovs-vsctl[250513]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Jan 23 10:40:51 compute-1 ceph-mon[80126]: pgmap v1435: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:51 compute-1 sudo[250551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:40:51 compute-1 sudo[250551]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:40:51 compute-1 sudo[250551]: pam_unix(sudo:session): session closed for user root
Jan 23 10:40:51 compute-1 podman[250575]: 2026-01-23 10:40:51.826321022 +0000 UTC m=+0.069907815 container health_status e6e7618b9afdfd4f45c1de1b6d4cd8465eba825fde049598fd3a091dd014b7e6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:85729a662800e6b42ceb088545fed39a2ac58704b4a37fd540cdef3ebf9e59a2', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Jan 23 10:40:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:51 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:52 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:52 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:52.248 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:52 compute-1 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Jan 23 10:40:52 compute-1 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Jan 23 10:40:52 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:52 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:52 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:52.420 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:52 compute-1 virtqemud[225011]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Jan 23 10:40:52 compute-1 ceph-mon[80126]: pgmap v1436: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:52 compute-1 ceph-mon[80126]: from='client.27163 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:52 compute-1 nova_compute[225705]: 2026-01-23 10:40:52.647 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:52 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: cache status {prefix=cache status} (starting...)
Jan 23 10:40:52 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:53 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: client ls {prefix=client ls} (starting...)
Jan 23 10:40:53 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:53 compute-1 lvm[250910]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Jan 23 10:40:53 compute-1 lvm[250910]: VG ceph_vg0 finished
Jan 23 10:40:53 compute-1 ceph-mon[80126]: from='client.27175 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:53 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2735878044' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:53 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:53 compute-1 ceph-mon[80126]: from='client.17643 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:53 compute-1 ceph-mon[80126]: from='client.27184 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:53 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2326781007' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:53 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2803210379' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:53 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: damage ls {prefix=damage ls} (starting...)
Jan 23 10:40:53 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump loads {prefix=dump loads} (starting...)
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "report"} v 0)
Jan 23 10:40:54 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1569340402' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:54.250 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:54 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:54 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:54 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:54.423 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:54 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Jan 23 10:40:54 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2991488001' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:54 compute-1 ceph-mon[80126]: pgmap v1437: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.17655 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.27032 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.27199 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4283374481' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2749828607' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.17667 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1569340402' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "report"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.27044 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1170917103' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2987158526' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.17679 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1278040154' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2991488001' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1520575926' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: get subtrees {prefix=get subtrees} (starting...)
Jan 23 10:40:54 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:55 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config log"} v 0)
Jan 23 10:40:55 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2517041095' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:40:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:40:55.076 143098 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Jan 23 10:40:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:40:55.078 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Jan 23 10:40:55 compute-1 ovn_metadata_agent[143093]: 2026-01-23 10:40:55.078 143098 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Jan 23 10:40:55 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: ops {prefix=ops} (starting...)
Jan 23 10:40:55 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:55 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Jan 23 10:40:55 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2494977871' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config-key dump"} v 0)
Jan 23 10:40:55 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/758304118' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.27056 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.27244 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1579084197' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1948725629' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2517041095' entity='client.admin' cmd=[{"prefix": "config log"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.27068 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/431388215' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.27256 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.17706 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2275578091' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2494977871' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2891389420' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/758304118' entity='client.admin' cmd=[{"prefix": "config-key dump"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2152170927' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:55 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:56 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: session ls {prefix=session ls} (starting...)
Jan 23 10:40:56 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj Can't run that command on an inactive MDS!
Jan 23 10:40:56 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:40:56 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3865666164' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:56 compute-1 ceph-mds[84630]: mds.cephfs.compute-1.bcvzvj asok_command: status {prefix=status} (starting...)
Jan 23 10:40:56 compute-1 nova_compute[225705]: 2026-01-23 10:40:56.157 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:40:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:56.253 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:40:56 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:56 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:56 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:56.424 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:56 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:40:56 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1456795587' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:56 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "features"} v 0)
Jan 23 10:40:56 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/883122861' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:40:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:40:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:56 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:40:57 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:40:57 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:40:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:40:57 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/365728033' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Jan 23 10:40:57 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3781180657' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: pgmap v1438: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.17721 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4141150626' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.27095 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/858187861' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3865666164' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2474866121' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4002027261' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.27101 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1120773140' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/676404484' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2664983427' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1456795587' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/883122861' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? ' entity='client.admin' cmd=[{"prefix": "features"}]: dispatch
Jan 23 10:40:57 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3387213635' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:57 compute-1 nova_compute[225705]: 2026-01-23 10:40:57.649 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:40:57 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:40:57 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/823562891' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Jan 23 10:40:58 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3408741616' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:40:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:40:58.256 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:40:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat"} v 0)
Jan 23 10:40:58 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/458587115' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:58 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:40:58 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:40:58 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:40:58.427 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.27295 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.17769 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/365728033' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3781180657' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/246354880' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3056118106' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/529095491' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2687217909' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3547435586' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/823562891' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3096053969' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3182871652' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3408741616' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/458587115' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:40:58 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/445151465' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:58 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Jan 23 10:40:58 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3276688100' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump"} v 0)
Jan 23 10:40:59 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1212026481' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: pgmap v1439: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.17787 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.27331 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.17808 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.27346 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3124684249' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1271257117' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.17820 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.27361 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/445151465' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3276688100' entity='client.admin' cmd=[{"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/666089249' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.17841 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1254495411' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.27385 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.27185 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1212026481' entity='client.admin' cmd=[{"prefix": "mgr dump"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1596010367' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: from='client.17862 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:40:59 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Jan 23 10:40:59 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1315781543' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:22.206018+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:23.206158+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:24.206450+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022c000 session 0x55a5637525a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64c00 session 0x55a5637523c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:25.206674+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:26.207093+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:27.207477+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:28.207664+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:29.207960+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:30.208154+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:31.208299+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990003 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:32.208561+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:33.208698+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:34.209194+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:35.209396+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82485248 unmapped: 1359872 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 49.582626343s of 49.590423584s, submitted: 1
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:36.209648+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 990135 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:37.209820+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:38.210347+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:39.210572+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:40.210765+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:41.210915+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991647 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:42.211055+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:43.211295+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:44.211452+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:45.211584+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:46.211845+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 992568 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:47.212017+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.072285652s of 12.089330673s, submitted: 4
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:48.212177+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82493440 unmapped: 1351680 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:49.212410+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:50.212574+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:51.212723+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82501632 unmapped: 1343488 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a560b463c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56016f400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:52.212938+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:53.213117+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:54.213315+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:55.213852+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:56.214178+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:57.214415+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82509824 unmapped: 1335296 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:58.214581+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:08:59.214725+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:00.214882+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:01.215041+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:02.215272+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:03.215424+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:04.215584+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:05.215793+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:06.215995+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:07.216212+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:08.216385+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:09.216641+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:10.216884+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:11.217050+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:12.217291+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:13.217585+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:14.217741+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:15.217879+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:16.218113+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:17.218247+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82518016 unmapped: 1327104 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:18.218437+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:19.218638+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:20.218858+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82526208 unmapped: 1318912 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:21.219060+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Cumulative writes: 9060 writes, 35K keys, 9060 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.02 MB/s
                                           Cumulative WAL: 9060 writes, 1959 syncs, 4.62 writes per sync, written: 0.02 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 781 writes, 1248 keys, 781 commit groups, 1.0 writes per commit group, ingest: 0.41 MB, 0.00 MB/s
                                           Interval WAL: 781 writes, 366 syncs, 2.13 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                           
                                           ** Compaction Stats [default] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      2/0    2.63 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Sum      2/0    2.63 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [default] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      4.01              0.00         1    4.007       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 4.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [default] **
                                           
                                           ** Compaction Stats [m-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-0] **
                                           
                                           ** Compaction Stats [m-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-1] **
                                           
                                           ** Compaction Stats [m-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [m-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [m-2] **
                                           
                                           ** Compaction Stats [p-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.56 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Sum      1/0    1.56 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.05              0.00         1    0.046       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-0] **
                                           
                                           ** Compaction Stats [p-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-1] **
                                           
                                           ** Compaction Stats [p-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [p-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [p-2] **
                                           
                                           ** Compaction Stats [O-0] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-0] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-0] **
                                           
                                           ** Compaction Stats [O-1] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-1] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-1] **
                                           
                                           ** Compaction Stats [O-2] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      1/0    1.25 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Sum      1/0    1.25 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [O-2] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e9309b0#2 capacity: 224.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,8.85555e-05%) FilterBlock(1,0.11 KB,4.76837e-05%) IndexBlock(1,0.14 KB,6.13076e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [O-2] **
                                           
                                           ** Compaction Stats [L] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [L] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.002       0      0       0.0       0.0
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [L] **
                                           
                                           ** Compaction Stats [P] **
                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                            Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                           
                                           ** Compaction Stats [P] **
                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                           
                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                           
                                           Uptime(secs): 1204.3 total, 600.0 interval
                                           Flush(GB): cumulative 0.000, interval 0.000
                                           AddFile(GB): cumulative 0.000, interval 0.000
                                           AddFile(Total Files): cumulative 0, interval 0
                                           AddFile(L0 Files): cumulative 0, interval 0
                                           AddFile(Keys): cumulative 0, interval 0
                                           Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                           Block cache BinnedLRUCache@0x55a55e931350#2 capacity: 1.12 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 7.5e-05 secs_since: 0
                                           Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,0.000120534%) FilterBlock(3,0.33 KB,2.78155e-05%) IndexBlock(3,0.34 KB,2.914e-05%) Misc(1,0.00 KB,0%)
                                           
                                           ** File Read Latency Histogram By Level [P] **
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:22.219235+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:23.219436+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:24.219627+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:25.219840+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:26.220108+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:27.220325+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:28.220599+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:29.220752+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:30.232552+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:31.232718+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:32.232852+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:33.232994+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:34.233339+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:35.233652+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:36.234003+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:37.234185+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:38.234343+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:39.234527+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:40.234726+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82558976 unmapped: 1286144 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:41.234872+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:42.235048+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82575360 unmapped: 1269760 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:43.235190+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:44.235347+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:45.235535+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:46.235747+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:47.235876+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:48.236002+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:49.236131+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:50.236252+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:51.236374+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:52.236561+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:53.236728+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:54.237105+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:55.237323+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:56.237509+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:57.237693+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:58.237870+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:09:59.238116+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:00.238249+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:01.238389+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:02.238564+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:03.238748+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:04.238897+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:05.239119+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:06.239317+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:07.239572+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:08.239719+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:09.239869+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82583552 unmapped: 1261568 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:10.240015+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56022d400 session 0x55a560ef6780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d65000 session 0x55a5601c9c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:11.242549+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:12.242701+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:13.242892+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:14.243197+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:15.243441+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:16.243777+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:17.243990+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:18.244115+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:19.244264+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:20.244401+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:21.244571+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 991845 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:22.244724+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82591744 unmapped: 1253376 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 95.210830688s of 95.219345093s, submitted: 2
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:23.244976+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,0,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:24.245121+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:25.245295+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:26.245547+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82599936 unmapped: 1245184 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995001 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:27.245785+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:28.245944+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:29.246090+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:30.246251+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 82608128 unmapped: 1236992 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:31.246592+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:32.246748+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:33.246965+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:34.247139+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83656704 unmapped: 188416 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:35.247357+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:36.247594+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:37.247800+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:38.247963+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:39.248224+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:40.248401+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:41.248660+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:42.248830+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:43.249063+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:44.249257+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:45.249486+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:46.249747+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:47.249995+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:48.250196+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:49.250449+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:50.250640+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83664896 unmapped: 180224 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:51.250852+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:52.251060+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:53.251655+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:54.251824+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83673088 unmapped: 172032 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:55.252050+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:56.252298+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:57.252577+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:58.252766+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:10:59.253021+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:00.253215+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:01.253481+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:02.253700+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:03.253925+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:04.254115+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:05.254380+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:06.254592+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:07.254800+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:08.255020+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:09.255298+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:10.255577+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:11.255859+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:12.256029+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:13.256212+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:14.256373+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83681280 unmapped: 163840 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:15.256560+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:16.256822+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:17.257105+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:18.257281+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:19.257471+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:20.257792+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:21.258088+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:22.258274+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:23.258466+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:24.258678+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:25.258877+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:26.259102+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:27.259292+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:28.259536+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:29.259678+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:30.259845+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:31.260016+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:32.260176+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:33.260367+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:34.260598+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:35.260862+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:36.261255+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:37.261581+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:38.261848+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:39.262074+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:40.262246+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83689472 unmapped: 155648 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:41.262443+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:42.262618+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:43.262775+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:44.262939+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:45.263082+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83697664 unmapped: 147456 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:46.263265+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:47.263429+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:48.263628+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:49.263812+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:50.264025+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:51.264160+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:52.264379+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a560b61400 session 0x55a560c23a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a55fd2bc00 session 0x55a560b46780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:53.264588+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:54.264771+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d56000 session 0x55a56370c000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d64000 session 0x55a560223860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:55.264927+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:56.265155+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:57.265378+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83714048 unmapped: 131072 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:58.265569+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:11:59.265771+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83722240 unmapped: 122880 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:00.265949+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:01.266114+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:02.266289+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995199 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:03.266450+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 99.385841370s of 100.670951843s, submitted: 7
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83730432 unmapped: 114688 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:04.266615+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83787776 unmapped: 57344 heap: 83845120 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:05.266810+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,1,0,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83861504 unmapped: 1032192 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:06.267172+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 83869696 unmapped: 1024000 heap: 84893696 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:07.267395+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995535 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [0,0,0,0,0,0,0,3])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:08.267550+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:09.267762+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84926464 unmapped: 1015808 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:10.267932+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84934656 unmapped: 1007616 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:11.268129+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 84942848 unmapped: 999424 heap: 85942272 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:12.268284+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995463 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:13.268427+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 3.510344267s of 10.329211235s, submitted: 399
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:14.268567+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:15.268709+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:16.268870+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:17.268991+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994281 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:18.269131+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:19.269379+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:20.269546+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:21.269752+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:22.269910+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:23.270331+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:24.270543+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:25.270739+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:26.270953+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:27.271166+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:28.271373+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:29.271577+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85245952 unmapped: 1744896 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:30.271743+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:31.271937+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:32.272151+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:33.272369+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:34.272578+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:35.272761+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:36.273019+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:37.273203+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85254144 unmapped: 1736704 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:38.273381+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:39.273569+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:40.273730+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85262336 unmapped: 1728512 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:41.273903+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:42.274115+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:43.274300+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a561d67400 session 0x55a5602190e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 ms_handle_reset con 0x55a56049b000 session 0x55a562f18960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:44.274432+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:45.274609+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:46.274816+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:47.274992+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:48.275176+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:49.275386+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:50.275575+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:51.275784+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:52.276000+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994017 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:53.276260+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.042835236s of 40.052230835s, submitted: 3
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:54.277013+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85270528 unmapped: 1720320 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:55.277230+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:56.277567+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:57.277775+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 994149 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:58.277986+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:12:59.278254+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:00.278470+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d65000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:01.278760+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:02.278996+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:03.279165+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:04.279431+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:05.279708+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:06.279998+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:07.280241+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995661 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.938447952s of 13.945899963s, submitted: 2
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:08.280427+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:09.280683+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:10.280879+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:11.281022+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:12.281179+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:13.281426+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:14.281618+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:15.281802+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:16.281999+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:17.282186+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:18.282594+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:19.282768+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:20.282925+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:21.283078+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:22.283238+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:23.283403+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:24.283544+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:25.283680+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:26.283861+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:27.284055+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:28.284281+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:29.284481+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:30.284746+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:31.284916+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:32.285134+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:33.285391+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:34.285620+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:35.285836+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:36.286101+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:37.286303+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85278720 unmapped: 1712128 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:38.286551+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:39.286787+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:40.286982+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:41.287146+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:42.287792+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:43.290689+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:44.291793+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:45.292469+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:46.293558+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:47.294282+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:48.294575+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:49.295690+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:50.296291+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:51.296889+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:52.297092+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:53.297276+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:54.297601+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:55.297760+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:56.298028+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:57.298236+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:58.298444+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:13:59.298691+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:00.298863+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:01.299168+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:02.299404+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:03.299585+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:04.299750+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:05.299908+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:06.300111+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:07.300477+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:08.300910+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:09.301320+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:10.301440+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:11.301743+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:12.301936+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:13.302151+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:14.302346+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:15.302531+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:16.302847+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:17.303120+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:18.303406+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:19.303643+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:20.303819+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:21.304081+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:22.304256+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85286912 unmapped: 1703936 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:23.304409+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:24.304597+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:25.304748+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:26.304954+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:27.305252+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:28.305537+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:29.305801+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:30.305976+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:31.306146+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:32.306371+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:33.306581+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:34.306826+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:35.307015+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:36.307244+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:37.307617+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:38.307858+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:39.308165+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:40.308328+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:41.308593+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85295104 unmapped: 1695744 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:42.308874+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:43.309160+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:44.309307+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:45.309581+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:46.309782+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:47.309954+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:48.310145+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:49.310331+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:50.310564+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:51.310797+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:52.310996+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:53.311183+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85303296 unmapped: 1687552 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:54.311367+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:55.311813+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:56.312036+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:57.312151+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:58.312299+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:14:59.312451+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:00.312603+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:01.312789+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:02.312972+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:03.313200+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 995529 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:04.313397+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:05.313565+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:06.313828+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:07.314036+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85311488 unmapped: 1679360 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 heartbeat osd_stat(store_statfs(0x4fc5f4000/0x0/0x4ffc00000, data 0x15b9c4/0x218000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 138 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 120.149322510s of 120.153068542s, submitted: 1
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:08.314191+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 999295 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85319680 unmapped: 1671168 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:09.314330+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:10.314591+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56022d800 session 0x55a560b46000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85368832 unmapped: 1622016 heap: 86990848 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:11.314757+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fc5e9000/0x0/0x4ffc00000, data 0x161cf8/0x221000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:12.314950+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 141 ms_handle_reset con 0x55a56049b000 session 0x55a563595a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:13.315134+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1144302 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 141 heartbeat osd_stat(store_statfs(0x4fb179000/0x0/0x4ffc00000, data 0x15d1cf8/0x1691000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 141 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:14.315313+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85557248 unmapped: 18219008 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:15.315467+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:16.316196+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:17.316387+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:18.316696+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85573632 unmapped: 18202624 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:19.316893+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:20.317037+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:21.317241+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:22.317904+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:23.318040+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:24.318728+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:25.318891+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:26.319356+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:27.320615+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:28.321322+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a563594000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a55fd2bc00 session 0x55a562f1fc20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:29.321626+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:30.322043+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:31.322181+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:32.322373+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:33.322678+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:34.322827+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:35.323042+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:36.323354+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:37.323530+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85581824 unmapped: 18194432 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:38.323707+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150154 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022d400 session 0x55a56387fa40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:39.323852+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.783803940s of 31.204965591s, submitted: 49
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:40.324146+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:41.324388+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:42.324546+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:43.324795+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150286 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:44.324998+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb173000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85590016 unmapped: 18186240 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:45.325255+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d65000 session 0x55a560e963c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64c00 session 0x55a5627854a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2bc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:46.325539+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:47.325794+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:48.326080+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1150194 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 heartbeat osd_stat(store_statfs(0x4fb175000/0x0/0x4ffc00000, data 0x15d5dd2/0x1697000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:49.326268+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a560a07a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a560b61400 session 0x55a560b472c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d64000 session 0x55a560e912c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85598208 unmapped: 18178048 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:50.326537+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.051367760s of 11.062813759s, submitted: 3
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a561d67400 session 0x55a560e96f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022c400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56022c400 session 0x55a563594b40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85614592 unmapped: 18161664 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:51.326755+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 ms_handle_reset con 0x55a56049b000 session 0x55a5635941e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 85639168 unmapped: 18137088 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:52.326957+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _renew_subs
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:53.327137+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a560b61400 session 0x55a562e5e960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387e5a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d67400 session 0x55a5637521e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561cd6000 session 0x55a560219e00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1187745 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a56049b000 session 0x55a560e914a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87097344 unmapped: 16678912 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:54.327325+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:55.327565+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:56.327801+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87130112 unmapped: 16646144 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:57.327985+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87138304 unmapped: 16637952 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 ms_handle_reset con 0x55a561d64000 session 0x55a56387fc20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 heartbeat osd_stat(store_statfs(0x4faeb9000/0x0/0x4ffc00000, data 0x188b083/0x1951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:58.328231+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1186624 data_alloc: 218103808 data_used: 303104
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 87146496 unmapped: 16629760 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:15:59.328436+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89415680 unmapped: 14360576 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:00.328607+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89473024 unmapped: 14303232 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 145 handle_osd_map epochs [145,146], i have 145, src has [1,146]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.645101547s of 10.825790405s, submitted: 53
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:01.328751+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:02.328953+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89497600 unmapped: 14278656 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:03.329197+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:04.329394+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:05.329559+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:06.329772+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:07.329955+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4faeb6000/0x0/0x4ffc00000, data 0x188d078/0x1955000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89513984 unmapped: 14262272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:08.330117+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1209371 data_alloc: 218103808 data_used: 3031040
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:09.330279+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:10.330693+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 89530368 unmapped: 14245888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:11.330871+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.822414398s of 10.054231644s, submitted: 18
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:12.331097+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93847552 unmapped: 9928704 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:13.331271+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa929000/0x0/0x4ffc00000, data 0x1e13078/0x1edb000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:14.331475+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92520448 unmapped: 11255808 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:15.331662+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:16.331923+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:17.332129+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:18.332387+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253507 data_alloc: 218103808 data_used: 3325952
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:19.332624+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:20.332907+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:21.333084+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:22.333302+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:23.333601+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:24.333826+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:25.334032+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:26.334281+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:27.334516+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:28.334769+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:29.334987+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:30.335171+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:31.335403+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:32.335647+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:33.335904+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:34.336162+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:35.336331+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:36.336540+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:37.336717+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:38.336905+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92585984 unmapped: 11190272 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a56327b0e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3dc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a563595e00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a562f192c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253523 data_alloc: 218103808 data_used: 3325952
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:39.337081+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92602368 unmapped: 11173888 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 27.268079758s of 28.147586823s, submitted: 54
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a56021e780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef6d20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:40.337262+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 92594176 unmapped: 11182080 heap: 103776256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a562785680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa92f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:41.337444+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a55f9194a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3dc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3dc00 session 0x55a562e5fe00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e5f2c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a5602214a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a560222f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:42.337677+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:43.337867+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:44.338055+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:45.338226+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:46.338451+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:47.338644+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:48.338872+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:49.339101+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:50.339326+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:51.339624+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93675520 unmapped: 20602880 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:52.339919+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9edf000/0x0/0x4ffc00000, data 0x2864088/0x292d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:53.340086+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93683712 unmapped: 20594688 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1336146 data_alloc: 218103808 data_used: 3334144
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:54.340263+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:55.340672+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93290496 unmapped: 20987904 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:56.341013+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 93298688 unmapped: 20979712 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.620344162s of 17.155050278s, submitted: 29
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a560b99c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc1800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:57.341156+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 94502912 unmapped: 19775488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:58.341318+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100663296 unmapped: 13615104 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:16:59.341627+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:00.341850+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:01.342114+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:02.342314+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:03.342606+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104710144 unmapped: 9568256 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1413079 data_alloc: 234881024 data_used: 13975552
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:04.342848+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:05.343078+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:06.343423+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:07.343663+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104742912 unmapped: 9535488 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:08.343878+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9ebb000/0x0/0x4ffc00000, data 0x2888088/0x2951000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x33ef9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 104775680 unmapped: 9502720 heap: 114278400 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.021077156s of 12.051360130s, submitted: 8
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1439531 data_alloc: 234881024 data_used: 14000128
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:09.344100+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111493120 unmapped: 9658368 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:10.344365+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112254976 unmapped: 8896512 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4e000/0x0/0x4ffc00000, data 0x3655088/0x371e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:11.344622+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:12.344909+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:13.345225+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1517819 data_alloc: 234881024 data_used: 14286848
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:14.345489+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112525312 unmapped: 8626176 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:15.345734+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:16.345975+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:17.346265+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112533504 unmapped: 8617984 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:18.346485+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112541696 unmapped: 8609792 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 9.255970955s of 10.148886681s, submitted: 105
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc1800 session 0x55a562e563c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a56370c780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1513215 data_alloc: 234881024 data_used: 14286848
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:19.346650+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f7f4c000/0x0/0x4ffc00000, data 0x3657088/0x3720000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103571456 unmapped: 17580032 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563595680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:20.346878+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:21.347129+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:22.347361+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:23.347676+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:24.347956+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:25.348224+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:26.348557+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:27.348742+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:28.348967+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103243776 unmapped: 17907712 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1267213 data_alloc: 218103808 data_used: 3334144
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.262123108s of 10.403896332s, submitted: 50
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:29.349172+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560c245a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5602192c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 103260160 unmapped: 17891328 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f978f000/0x0/0x4ffc00000, data 0x1e15078/0x1edd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64000 session 0x55a56103c5a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:30.349344+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:31.349583+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:32.349809+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560ef61e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562f1f680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:33.350016+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:34.350240+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:35.350487+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:36.350810+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:37.351035+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:38.351247+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184030 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:39.351450+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:40.351681+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:41.351938+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:42.352115+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:43.352374+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.069961548s of 14.267497063s, submitted: 33
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:44.352564+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:45.352807+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:46.353076+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:47.353284+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:48.353567+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184162 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fc9000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:49.353806+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:50.354128+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:51.354337+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 19980288 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:52.354540+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee6f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5601c9c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d69400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d69400 session 0x55a5601c9680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:53.354811+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.291867256s of 10.369996071s, submitted: 3
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a56327b860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1184631 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560a9fa40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:54.355046+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100122624 unmapped: 21028864 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a55fee7a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:55.355247+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 100130816 unmapped: 21020672 heap: 121151488 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560219a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3cc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3cc00 session 0x55a5630701e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e56780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:56.355458+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a5630d8f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560ef6780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:57.355659+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630dad20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aef000/0x0/0x4ffc00000, data 0x1ab6ff3/0x1b7d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:58.355838+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54000 session 0x55a56103d4a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1226346 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:17:59.356087+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560eef2c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99860480 unmapped: 29687808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:00.356327+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a560eefe00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99868672 unmapped: 29679616 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d67400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:01.356489+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 99893248 unmapped: 29655040 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:02.356671+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:03.356901+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:04.357044+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:05.357174+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101163008 unmapped: 28385280 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:06.357385+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:07.357648+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:08.357865+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1262160 data_alloc: 218103808 data_used: 5316608
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:09.358124+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101171200 unmapped: 28377088 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9aee000/0x0/0x4ffc00000, data 0x1ab7003/0x1b7e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:10.358329+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:11.358622+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101187584 unmapped: 28360704 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:12.358838+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 18.546638489s of 18.645618439s, submitted: 27
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102637568 unmapped: 26910720 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562e561e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b990e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:13.358991+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1300302 data_alloc: 218103808 data_used: 6565888
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:14.359170+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105865216 unmapped: 23683072 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9635000/0x0/0x4ffc00000, data 0x1f6a003/0x2031000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,10])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:15.359469+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106921984 unmapped: 22626304 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:16.359770+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:17.360012+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107028480 unmapped: 22519808 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:18.360230+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310702 data_alloc: 218103808 data_used: 6471680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:19.360425+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:20.360588+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:21.360759+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:22.360961+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107069440 unmapped: 22478848 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:23.361164+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.670749664s of 11.247441292s, submitted: 64
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:24.361384+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:25.361571+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:26.361791+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107077632 unmapped: 22470656 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:27.361987+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:28.362169+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:29.362394+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:30.362574+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107085824 unmapped: 22462464 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:31.362767+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:32.362924+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107118592 unmapped: 22429696 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:33.363214+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9613000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1310834 data_alloc: 218103808 data_used: 6471680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:34.363452+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107151360 unmapped: 22396928 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:35.363625+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.062977791s of 12.069572449s, submitted: 1
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:36.363800+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:37.363963+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:38.364153+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105586688 unmapped: 23961600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1302567 data_alloc: 218103808 data_used: 6483968
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:39.364371+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:40.364571+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d67400 session 0x55a560eef0e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:41.364787+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5603c8b40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:42.365006+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 105594880 unmapped: 23953408 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9621000/0x0/0x4ffc00000, data 0x1f84003/0x204b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:43.365219+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102514688 unmapped: 27033600 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:44.365404+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a562f1e5a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:45.365622+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:46.365799+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:47.366019+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:48.366173+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:49.366378+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:50.366682+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:51.367013+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:52.367201+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:53.367426+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:54.367679+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:55.367941+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:56.368240+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:57.368466+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:58.368701+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:18:59.368942+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:00.369152+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:01.369347+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:02.369590+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:03.369831+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:04.369963+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:05.370184+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:06.370361+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:07.370610+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:08.370820+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:09.370997+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1192404 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:10.371148+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9fcb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630703c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b61400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560b61400 session 0x55a563070f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5630705a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:11.371325+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101097472 unmapped: 28450816 heap: 129548288 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:12.371596+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 34.487091064s of 36.540157318s, submitted: 34
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560ef61e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef63c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a560ef7e00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c55800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55800 session 0x55a560ef6960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560ef6000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:13.371776+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:14.372049+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1275342 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:15.372221+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102113280 unmapped: 31113216 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:16.372542+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:17.372719+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:18.372955+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102195200 unmapped: 31031296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:19.373097+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1276816 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c57c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 101695488 unmapped: 31531008 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:20.373255+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 102670336 unmapped: 30556160 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:21.373392+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 1804.3 total, 600.0 interval
                                           Cumulative writes: 10K writes, 39K keys, 10K commit groups, 1.0 writes per commit group, ingest: 0.03 GB, 0.01 MB/s
                                           Cumulative WAL: 10K writes, 2684 syncs, 4.00 writes per sync, written: 0.03 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 1689 writes, 4700 keys, 1689 commit groups, 1.0 writes per commit group, ingest: 4.62 MB, 0.01 MB/s
                                           Interval WAL: 1689 writes, 725 syncs, 2.33 writes per sync, written: 0.00 GB, 0.01 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:22.373563+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:23.373774+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:24.373926+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:25.374180+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:26.374440+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:27.374617+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:28.374771+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:29.374933+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350139 data_alloc: 234881024 data_used: 11247616
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:30.375142+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:31.375357+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:32.375606+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:33.375740+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:34.375949+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1350291 data_alloc: 234881024 data_used: 11251712
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:35.376117+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9515000/0x0/0x4ffc00000, data 0x2091fe3/0x2157000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.624202728s of 23.833007812s, submitted: 21
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107888640 unmapped: 25337856 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:36.376344+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:37.376602+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112435200 unmapped: 20791296 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:38.376763+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111206400 unmapped: 22020096 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:39.376956+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412875 data_alloc: 234881024 data_used: 11755520
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8cba000/0x0/0x4ffc00000, data 0x28ecfe3/0x29b2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [0,0,0,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111697920 unmapped: 21528576 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:40.377098+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113246208 unmapped: 19980288 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:41.377247+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:42.377416+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113278976 unmapped: 19947520 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:43.377631+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:44.377794+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1433231 data_alloc: 234881024 data_used: 12496896
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113319936 unmapped: 19906560 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:45.377982+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c24000/0x0/0x4ffc00000, data 0x297afe3/0x2a40000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:46.378188+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:47.378367+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:48.378533+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:49.378703+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428407 data_alloc: 234881024 data_used: 12496896
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:50.378926+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.531507492s of 14.888542175s, submitted: 102
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:51.379071+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c03000/0x0/0x4ffc00000, data 0x29a3fe3/0x2a69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:52.379205+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:53.379340+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:54.379506+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1428511 data_alloc: 234881024 data_used: 12496896
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:55.379656+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:56.379914+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c02000/0x0/0x4ffc00000, data 0x29a4fe3/0x2a6a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:57.380093+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113074176 unmapped: 20152320 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:58.380283+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120709120 unmapped: 12517376 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:19:59.380458+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a560e90d20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c22000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd9000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd9000 session 0x55a560b47860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1484761 data_alloc: 234881024 data_used: 12496896
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560c24000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630710e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:00.380604+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560b463c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:01.380776+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212800 session 0x55a562785a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113311744 unmapped: 19914752 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:02.380930+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562214800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214800 session 0x55a55fee70e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.321987152s of 11.643644333s, submitted: 14
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a563594960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f84b7000/0x0/0x4ffc00000, data 0x30effe3/0x31b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:03.381076+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113614848 unmapped: 19611648 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:04.381196+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1503794 data_alloc: 234881024 data_used: 14667776
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:05.381340+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:06.381582+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:07.381768+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118800384 unmapped: 14426112 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:08.381933+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:09.382117+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532370 data_alloc: 234881024 data_used: 18919424
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:10.382280+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118816768 unmapped: 14409728 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:11.382450+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:12.382659+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:13.382845+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8492000/0x0/0x4ffc00000, data 0x3114006/0x31da000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x458f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:14.382967+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 118841344 unmapped: 14385152 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1532658 data_alloc: 234881024 data_used: 18923520
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.284761429s of 12.309606552s, submitted: 7
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:15.383041+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:16.383180+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122421248 unmapped: 10805248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:17.383365+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121954304 unmapped: 11272192 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:18.383597+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:19.383793+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1649774 data_alloc: 234881024 data_used: 19283968
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:20.383953+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:21.384064+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 11239424 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72c2000/0x0/0x4ffc00000, data 0x3ec5006/0x3f8b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:22.384227+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:23.384393+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f72cf000/0x0/0x4ffc00000, data 0x3ec7006/0x3f8d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:24.384620+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1644310 data_alloc: 234881024 data_used: 19283968
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:25.384846+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121397248 unmapped: 11829248 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55fee7c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a56103c000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a5603c9a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d000 session 0x55a5601c8000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.608616829s of 10.889985085s, submitted: 103
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:26.385056+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601cbe00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:27.385206+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:28.385415+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87ee000/0x0/0x4ffc00000, data 0x29a7fe3/0x2a6d000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:29.385610+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1442224 data_alloc: 234881024 data_used: 12496896
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a563070960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee6f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:30.387571+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:31.387736+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117907456 unmapped: 15319040 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a561048d20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:32.387894+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:33.388045+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:34.388220+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1217850 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:35.388355+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:36.388598+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108363776 unmapped: 24862720 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.357207298s of 10.545221329s, submitted: 64
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:37.388779+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:38.388950+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:39.389137+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108371968 unmapped: 24854528 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1219494 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:40.392112+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:41.394785+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:42.395027+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108380160 unmapped: 24846336 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:43.396795+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:44.398246+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1221006 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:45.399370+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:46.400321+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:47.400751+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:48.401359+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.268618584s of 12.279949188s, submitted: 3
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:49.401889+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:50.402331+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:51.402723+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:52.402945+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:53.403442+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:54.404145+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:55.404830+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:56.405475+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:57.406105+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:58.406653+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:20:59.407161+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:00.407575+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:01.408008+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:02.408233+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:03.408427+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5623ae400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:04.408600+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108396544 unmapped: 24829952 heap: 133226496 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1220874 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.385444641s of 16.389841080s, submitted: 1
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:05.408743+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:06.409048+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:07.409283+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107470848 unmapped: 33103872 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5623ae400 session 0x55a560ef7a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5635954a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5630714a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:08.409440+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5630705a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a5630703c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:09.409745+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1286318 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:10.409900+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107495424 unmapped: 33079296 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:11.410072+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107503616 unmapped: 33071104 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:12.410229+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560eeed20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:13.412346+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 107511808 unmapped: 33062912 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:14.412644+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 106987520 unmapped: 33587200 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1288671 data_alloc: 218103808 data_used: 393216
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:15.412761+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108208128 unmapped: 32366592 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:16.414594+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:17.414796+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:18.414987+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108216320 unmapped: 32358400 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:19.415205+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:20.415359+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:21.415534+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f931e000/0x0/0x4ffc00000, data 0x1e78045/0x1f3e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:22.415717+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:23.415858+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a560eef4a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:24.416012+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 108224512 unmapped: 32350208 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1347647 data_alloc: 218103808 data_used: 8814592
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:25.416181+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 19.396499634s of 20.359004974s, submitted: 48
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111820800 unmapped: 28753920 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:26.416442+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115318784 unmapped: 25255936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:27.416604+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c5f000/0x0/0x4ffc00000, data 0x2537045/0x25fd000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115449856 unmapped: 25124864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8c54000/0x0/0x4ffc00000, data 0x2541045/0x2607000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:28.416791+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:29.416988+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1409157 data_alloc: 218103808 data_used: 9031680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:30.417192+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112910336 unmapped: 27664384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:31.417333+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bd3000/0x0/0x4ffc00000, data 0x25c3045/0x2689000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:32.417624+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113000448 unmapped: 27574272 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:33.417849+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113139712 unmapped: 27435008 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:34.418044+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113147904 unmapped: 27426816 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408089 data_alloc: 218103808 data_used: 9035776
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:35.418228+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:36.418571+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113156096 unmapped: 27418624 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bb2000/0x0/0x4ffc00000, data 0x25e4045/0x26aa000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:37.418754+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.870034218s of 12.169149399s, submitted: 80
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:38.418981+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113180672 unmapped: 27394048 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:39.419192+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411361 data_alloc: 218103808 data_used: 9035776
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:40.419414+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:41.419690+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113229824 unmapped: 27344896 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:42.419884+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:43.420084+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8bac000/0x0/0x4ffc00000, data 0x25ea045/0x26b0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:44.420242+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1411659 data_alloc: 218103808 data_used: 9043968
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:45.420384+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113238016 unmapped: 27336704 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:46.420565+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113295360 unmapped: 27279360 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:47.420721+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:48.420871+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:49.421072+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1412595 data_alloc: 218103808 data_used: 9043968
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:50.421264+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8ba2000/0x0/0x4ffc00000, data 0x25f4045/0x26ba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:51.421464+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.734275818s of 13.774451256s, submitted: 9
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:52.421713+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113303552 unmapped: 27271168 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:53.421944+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55fee7c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a55fee6f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a561048d20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d62800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d62800 session 0x55a562c57c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114442240 unmapped: 26132480 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:54.422201+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562c574a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a070e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a563071a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a5603c9680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c55c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c55c00 session 0x55a560ef7c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114581504 unmapped: 25993216 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec06e/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1432887 data_alloc: 218103808 data_used: 9048064
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:55.422424+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:56.422694+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa9000/0x0/0x4ffc00000, data 0x26ec0a7/0x27b3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:57.422864+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114614272 unmapped: 25960448 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:58.423112+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113704960 unmapped: 26869760 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a5603c9c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:21:59.423326+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562212000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113721344 unmapped: 26853376 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1434949 data_alloc: 218103808 data_used: 9048064
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:00.423576+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d66000 session 0x55a560223a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562d3d800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:01.423757+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa6000/0x0/0x4ffc00000, data 0x26ed0ca/0x27b5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:02.423962+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114294784 unmapped: 26279936 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.734139442s of 11.903190613s, submitted: 57
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:03.424216+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114343936 unmapped: 26230784 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:04.424378+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114401280 unmapped: 26173440 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440168 data_alloc: 234881024 data_used: 9789440
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:05.424535+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114589696 unmapped: 25985024 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [0,0,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:06.424823+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114794496 unmapped: 25780224 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:07.425028+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:08.425226+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:09.425439+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8aa3000/0x0/0x4ffc00000, data 0x26f10ca/0x27b9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1440509 data_alloc: 234881024 data_used: 9789440
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:10.425607+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 114835456 unmapped: 25739264 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:11.425748+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116137984 unmapped: 24436736 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:12.425961+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:13.426200+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:14.426392+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f840e000/0x0/0x4ffc00000, data 0x2d700ca/0x2e38000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117342208 unmapped: 23232512 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1504109 data_alloc: 234881024 data_used: 9850880
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:15.426609+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a560c25680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560ef65a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117350400 unmapped: 23224320 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.155679703s of 12.892781258s, submitted: 488
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:16.426871+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:17.427067+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116834304 unmapped: 23740416 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:18.427235+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:19.427420+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496421 data_alloc: 234881024 data_used: 9854976
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:20.427619+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8402000/0x0/0x4ffc00000, data 0x2d920ca/0x2e5a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:21.427915+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116957184 unmapped: 23617536 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:22.428275+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:23.428579+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116965376 unmapped: 23609344 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:24.428800+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f83f3000/0x0/0x4ffc00000, data 0x2da10ca/0x2e69000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1496237 data_alloc: 234881024 data_used: 9854976
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:25.428959+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117047296 unmapped: 23527424 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.086823463s of 10.129245758s, submitted: 9
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:26.429154+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a5601c8780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562212000 session 0x55a55f9181e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116473856 unmapped: 24100864 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c22f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:27.429343+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:28.429635+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:29.429905+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115990528 unmapped: 24584192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:30.430075+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:31.430250+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b20000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:32.430532+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561cd6000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:33.430747+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:34.430948+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115998720 unmapped: 24576000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:35.431153+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1426840 data_alloc: 218103808 data_used: 9109504
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:36.431429+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:37.431695+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:38.431992+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 12.182063103s of 12.550888062s, submitted: 72
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:39.432146+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:40.432318+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1424015 data_alloc: 218103808 data_used: 9109504
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a5601ca780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560c24f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 116006912 unmapped: 24567808 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:41.432445+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8b84000/0x0/0x4ffc00000, data 0x2612045/0x26d8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a563071680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:42.432608+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:43.432866+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:44.433099+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:45.433308+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:46.433567+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:47.433757+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:48.433920+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:49.434123+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:50.434477+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:51.434877+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:52.435162+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:53.435418+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110845952 unmapped: 29728768 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:54.435793+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:55.436034+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:56.436326+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:57.436540+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:58.436731+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2bc00 session 0x55a5627841e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a562c56000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110854144 unmapped: 29720576 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:22:59.436955+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:00.437120+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:01.437346+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:02.437821+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:03.437980+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:04.438177+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:05.438340+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245523 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:06.438587+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:07.438998+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:08.439212+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110862336 unmapped: 29712384 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:09.439354+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 30.355859756s of 30.541212082s, submitted: 59
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110870528 unmapped: 29704192 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:10.439566+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1245655 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:11.439859+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110878720 unmapped: 29696000 heap: 140574720 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562e57680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56327a5a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a560a07a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:12.439993+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc3800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a5603c94a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560a072c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562785e00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a560ef6000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56370cb40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc3800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc3800 session 0x55a560219860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:13.440295+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:14.440593+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112730112 unmapped: 31514624 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:15.440820+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1339283 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112795648 unmapped: 31449088 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:16.441114+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a560e914a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112803840 unmapped: 31440896 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:17.441361+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112967680 unmapped: 31277056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:18.441552+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:19.441787+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:20.441978+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1418779 data_alloc: 234881024 data_used: 12140544
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:21.442206+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 117686272 unmapped: 26558464 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:22.443884+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9001000/0x0/0x4ffc00000, data 0x2194ff2/0x225b000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560b465a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5635954a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.497446060s of 13.612625122s, submitted: 20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a56021fe00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:23.444096+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:24.444309+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:25.445084+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:26.445526+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:27.445703+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:28.445902+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:29.446414+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:30.446826+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:31.447199+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:32.447453+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:33.447739+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:34.448423+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110477312 unmapped: 33767424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:35.448591+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1253608 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:36.448860+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110485504 unmapped: 33759232 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022dc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.943146706s of 13.990984917s, submitted: 17
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022dc00 session 0x55a56387e780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a562c570e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:37.449076+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5602230e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a563752d20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d64400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d64400 session 0x55a562f19e00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:38.449332+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110690304 unmapped: 33554432 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:39.449524+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9819000/0x0/0x4ffc00000, data 0x197d045/0x1a43000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:40.449708+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1289281 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a563d89000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89000 session 0x55a560219a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:41.449934+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110698496 unmapped: 33546240 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561cd6000 session 0x55a562e56f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59800 session 0x55a562f1f680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563752000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a560c24000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:42.450071+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56000 session 0x55a5603c9a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:43.450289+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:44.450473+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111738880 unmapped: 32505856 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:45.450736+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111353856 unmapped: 32890880 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1298244 data_alloc: 218103808 data_used: 1359872
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:46.450970+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:47.451217+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:48.451465+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a563070960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.371298790s of 11.489388466s, submitted: 37
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a56021fa40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111976448 unmapped: 32268288 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9818000/0x0/0x4ffc00000, data 0x197d055/0x1a44000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562581400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:49.451709+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110919680 unmapped: 33325056 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56017a960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:50.451934+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110936064 unmapped: 33308672 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: mgrc ms_handle_reset ms_handle_reset con 0x55a561cb3c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: mgrc reconnect Terminating session with v2:192.168.122.100:6800/4198923246
Jan 23 10:41:00 compute-1 ceph-osd[77616]: mgrc reconnect Starting new session with [v2:192.168.122.100:6800/4198923246,v1:192.168.122.100:6801/4198923246]
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: get_auth_request con 0x55a561d56000 auth_method 0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: mgrc handle_mgr_configure stats_period=5
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:51.452077+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d68c00 session 0x55a562e5f680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5621f9800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56016f400 session 0x55a562ed23c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d43400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:52.452475+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:53.452755+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:54.453422+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a55f6d90e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:55.453617+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:56.453780+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:57.454014+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:58.454837+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:23:59.455068+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:00.455353+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260183 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:01.455617+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:02.456031+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:03.456205+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:04.456619+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:05.456756+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560b60c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 16.596210480s of 16.945894241s, submitted: 37
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1260315 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:06.457254+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:07.457414+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:08.457712+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:09.457892+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:10.458069+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:11.458218+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:12.458435+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:13.458655+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:14.458963+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:15.459101+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261695 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:16.459362+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111116288 unmapped: 33128448 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:17.459541+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:18.459688+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111124480 unmapped: 33120256 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:19.459930+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.821660995s of 13.956790924s, submitted: 3
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:20.460330+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:21.460478+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:22.460724+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111132672 unmapped: 33112064 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:23.460917+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:24.461147+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:25.461361+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1261563 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:26.461607+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:27.461746+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58400 session 0x55a5601ca3c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560b46780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a55f919860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bbb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a562c563c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562581400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a56370cd20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:28.462062+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:29.462303+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:30.462618+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a563d89800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563d89800 session 0x55a5630dad20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1303917 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111296512 unmapped: 32948224 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96ee000/0x0/0x4ffc00000, data 0x1aa8045/0x1b6e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:31.462772+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111304704 unmapped: 32940032 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a560179400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a560179400 session 0x55a560a9fa40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:32.462949+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56022d400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56022d400 session 0x55a5601ca3c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56049b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.447840691s of 13.576416016s, submitted: 37
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111329280 unmapped: 32915456 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:33.463094+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:34.463374+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56049b000 session 0x55a5601caf00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562581400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562cc2000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:35.463543+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307891 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:36.463748+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111140864 unmapped: 33103872 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:37.463926+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 110780416 unmapped: 33464320 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:38.464095+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:39.464271+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:40.464487+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:41.464753+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:42.464990+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:43.465235+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:44.465483+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111108096 unmapped: 33136640 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f96c9000/0x0/0x4ffc00000, data 0x1acc055/0x1b93000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:45.465707+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1342547 data_alloc: 218103808 data_used: 5341184
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111091712 unmapped: 33153024 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:46.466128+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111099904 unmapped: 33144832 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:47.466376+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 15.020989418s of 15.036432266s, submitted: 3
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112001024 unmapped: 32243712 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:48.466589+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115662848 unmapped: 28581888 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:49.466772+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:50.466943+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:51.467062+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:52.467207+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:53.467417+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:54.467600+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:55.467775+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:56.467967+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:57.468164+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:58.468331+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115671040 unmapped: 28573696 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:24:59.468865+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8e87000/0x0/0x4ffc00000, data 0x230e055/0x23d5000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:00.469015+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408965 data_alloc: 218103808 data_used: 5644288
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:01.469193+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115679232 unmapped: 28565504 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.164536476s of 14.331671715s, submitted: 56
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562581400 session 0x55a5601cba40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:02.469352+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562cc2000 session 0x55a5601c81e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560219e00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:03.469589+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:04.470032+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:05.470276+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:06.470654+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:07.470844+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:08.471189+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:09.471429+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:10.471671+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:11.471839+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:12.472199+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a56021f860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a562ed21e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:13.472680+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:14.472881+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:15.473043+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:16.473258+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:17.473555+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:18.473887+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:19.474079+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:20.474421+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9bba000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1271910 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:21.474740+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:22.475021+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:23.475166+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 112844800 unmapped: 31399936 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:24.475341+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562213c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 22.203777313s of 22.330352783s, submitted: 42
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213c00 session 0x55a5601ca960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562215800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215800 session 0x55a562ed2b40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee70e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560220000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a562e5f2c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:25.475572+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9689000/0x0/0x4ffc00000, data 0x1b0dfe3/0x1bd3000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343500 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:26.475888+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:27.476037+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:28.476274+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111247360 unmapped: 32997376 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:29.476545+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562202800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a562784780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255c400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a563610800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111263744 unmapped: 32980992 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:30.476692+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1343804 data_alloc: 218103808 data_used: 339968
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 111198208 unmapped: 33046528 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:31.476869+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:32.477065+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:33.477280+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:34.477607+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:35.477740+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399892 data_alloc: 218103808 data_used: 8769536
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:36.477917+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.868194580s of 11.951797485s, submitted: 14
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:37.478279+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:38.478455+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:39.478595+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:40.478749+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f927c000/0x0/0x4ffc00000, data 0x1f1afe3/0x1fe0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x499f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1399760 data_alloc: 218103808 data_used: 8769536
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:41.478931+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 113803264 unmapped: 30441472 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:42.479091+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 115597312 unmapped: 28647424 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:43.479255+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121249792 unmapped: 22994944 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:44.479554+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121274368 unmapped: 22970368 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d6bc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d6bc00 session 0x55a563071680
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a563070960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a560c881e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:45.479763+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c59400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c59400 session 0x55a560c883c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 119250944 unmapped: 24993792 heap: 144244736 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562202800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202800 session 0x55a5601ca780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561d56800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d56800 session 0x55a560ef6960
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a561048d20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5610485a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a5610492c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586678 data_alloc: 234881024 data_used: 9949184
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:46.480025+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:47.480197+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:48.480393+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:49.480564+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 31080448 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:50.480723+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 31072256 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562213800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a55fee6f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.895648956s of 14.161753654s, submitted: 94
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56257a400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56257a400 session 0x55a55fee7a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1586694 data_alloc: 234881024 data_used: 9949184
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:51.481046+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55fd2b000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55fd2b000 session 0x55a55fee6d20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c54800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c54800 session 0x55a55fee7c20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3fe3/0x3669000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:52.481176+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c58000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562213800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:53.481313+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 31088640 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:54.481444+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 124936192 unmapped: 27189248 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:55.481680+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:56.481881+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:57.483201+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:58.483563+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 131973120 unmapped: 20152320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:25:59.483787+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:00.484074+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:01.484216+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1666876 data_alloc: 234881024 data_used: 21913600
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:02.484391+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f77e2000/0x0/0x4ffc00000, data 0x35a3ff3/0x366a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:03.484575+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:04.484988+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 132005888 unmapped: 20119552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 13.947762489s of 13.955580711s, submitted: 2
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:05.488650+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134848512 unmapped: 17276928 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f777a000/0x0/0x4ffc00000, data 0x360bff3/0x36d2000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6b8d000/0x0/0x4ffc00000, data 0x41f0ff3/0x42b7000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:06.489897+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1759978 data_alloc: 234881024 data_used: 22151168
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134922240 unmapped: 17203200 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:07.490049+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:08.491419+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:09.491651+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134914048 unmapped: 17211392 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6af2000/0x0/0x4ffc00000, data 0x4293ff3/0x435a000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:10.494253+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:11.494515+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1771750 data_alloc: 234881024 data_used: 22212608
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 134946816 unmapped: 17178624 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:12.494818+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135184384 unmapped: 16941056 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:13.494979+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:14.496590+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135217152 unmapped: 16908288 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:15.497201+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:16.497385+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770406 data_alloc: 234881024 data_used: 22220800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135225344 unmapped: 16900096 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ace000/0x0/0x4ffc00000, data 0x42b7ff3/0x437e000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:17.497583+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:18.497738+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135258112 unmapped: 16867328 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.216358185s of 14.454858780s, submitted: 91
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:19.497905+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:20.498267+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135315456 unmapped: 16809984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:21.498456+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1770158 data_alloc: 234881024 data_used: 22220800
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:22.498762+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:23.499039+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135323648 unmapped: 16801792 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c58000 session 0x55a5603c92c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562213800 session 0x55a560ef7a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:24.499215+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a55f6c5400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:25.499469+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 135340032 unmapped: 16785408 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:26.499871+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1512596 data_alloc: 234881024 data_used: 9957376
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f6ac7000/0x0/0x4ffc00000, data 0x42beff3/0x4385000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8392000/0x0/0x4ffc00000, data 0x29f3ff3/0x2aba000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:27.500155+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:28.500403+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a55f6c5400 session 0x55a561048b40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:29.500913+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:30.501234+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:31.501436+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:32.501729+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:33.502172+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:34.502392+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:35.502584+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:36.505136+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:37.505346+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:38.505550+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:39.505687+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:40.505864+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8393000/0x0/0x4ffc00000, data 0x29f3fe3/0x2ab9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560ef72c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:41.506046+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a563610800 session 0x55a560c24f00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1511887 data_alloc: 234881024 data_used: 9957376
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:42.506228+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 127500288 unmapped: 24625152 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255cc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 23.984561920s of 24.057754517s, submitted: 24
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:43.506412+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:44.506590+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:45.506803+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:46.507008+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120643584 unmapped: 31481856 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:47.507223+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560219860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:48.507393+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:49.507600+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:50.507797+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:51.507970+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:52.508118+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:53.508306+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:54.508487+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:55.508790+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:56.508980+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:57.509217+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:58.509423+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:26:59.509628+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:00.509797+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:01.509974+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:02.510139+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:03.510415+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:04.510579+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:05.510776+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:06.510975+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:07.511113+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:08.511291+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:09.511736+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:10.511912+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:11.512061+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:12.512178+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:13.512581+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:14.512751+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:15.512958+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:16.513153+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:17.513459+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:18.513689+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:19.513992+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:20.514153+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f97ab000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:21.514395+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1297839 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:22.514594+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562214c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:23.514984+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 40.494945526s of 40.619098663s, submitted: 20
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 31473664 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:24.515143+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120668160 unmapped: 31457280 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:25.515348+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120283136 unmapped: 31842304 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560ef7860
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5631e0c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a562c565a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562214c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562214c00 session 0x55a560218000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:26.515585+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255c400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1358694 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:27.515809+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:28.516003+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255c400 session 0x55a560219e00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a56255cc00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a56255cc00 session 0x55a560ef65a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:29.516286+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:30.516557+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a5631e0c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:31.516842+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1357294 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:32.517072+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9084000/0x0/0x4ffc00000, data 0x1d02fe3/0x1dc8000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:33.517255+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120274944 unmapped: 31850496 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.344947815s of 10.433979988s, submitted: 23
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:34.517452+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5631e0c00 session 0x55a563071a40
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:35.517648+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562202c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a561c53400
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:36.517897+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1359231 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120307712 unmapped: 31817728 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:37.518114+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120242176 unmapped: 31883264 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:38.518330+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:39.518660+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:40.518790+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:41.519002+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:42.519123+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:43.519260+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:44.519357+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:45.519526+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:46.519693+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1408783 data_alloc: 218103808 data_used: 7593984
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:47.519859+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:48.519995+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f9083000/0x0/0x4ffc00000, data 0x1d03006/0x1dc9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 31031296 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 14.895034790s of 14.912478447s, submitted: 5
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:49.520176+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123174912 unmapped: 28950528 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:50.520322+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:51.520607+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1462949 data_alloc: 218103808 data_used: 7610368
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:52.520779+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f895c000/0x0/0x4ffc00000, data 0x242a006/0x24f0000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:53.520921+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:54.521098+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125665280 unmapped: 26460160 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:55.521231+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:56.521435+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1467605 data_alloc: 218103808 data_used: 7610368
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:57.521641+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:58.521818+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f8863000/0x0/0x4ffc00000, data 0x2523006/0x25e9000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:27:59.521975+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 125829120 unmapped: 26296320 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:00.522108+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 11.162461281s of 11.450368881s, submitted: 61
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126320640 unmapped: 25804800 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:01.522239+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1473525 data_alloc: 218103808 data_used: 7856128
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:02.522388+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:03.522554+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:04.522713+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:05.522918+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87e6000/0x0/0x4ffc00000, data 0x25a0006/0x2666000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:06.523125+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1476015 data_alloc: 218103808 data_used: 7860224
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126361600 unmapped: 25763840 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:07.523337+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:08.523513+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:09.523677+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 126500864 unmapped: 25624576 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:10.523859+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4f87c4000/0x0/0x4ffc00000, data 0x25c2006/0x2688000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x4daf9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 10.021368027s of 10.088058472s, submitted: 14
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562202c00 session 0x55a5602214a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561c53400 session 0x55a560a9f0e0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562215c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:11.524010+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308597 data_alloc: 218103808 data_used: 311296
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 123199488 unmapped: 28925952 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:12.524188+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562215c00 session 0x55a5630703c0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:13.524602+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:14.525109+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:15.525415+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:16.525855+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:17.526086+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:18.526258+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:19.526482+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:20.526773+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:21.527023+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:22.527313+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:23.527796+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:24.528085+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:25.528460+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:26.528681+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:27.529017+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:28.529215+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:29.529454+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:30.529672+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:31.529989+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:32.530228+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:33.530376+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:34.530612+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:35.530749+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:36.530984+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:37.531213+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:38.531559+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:39.531783+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:40.531981+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:41.532158+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:42.532336+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:43.532523+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:44.532677+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:45.532812+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:46.533037+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:47.533213+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:48.533414+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:49.533590+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:50.533763+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:51.534243+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:52.534414+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:53.534606+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:54.534843+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:55.535065+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:56.535346+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:57.535599+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:58.535820+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:28:59.536056+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:00.536303+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:01.536585+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:02.536775+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:03.536977+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:04.537206+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:05.537395+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:06.537686+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:07.537942+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:08.538076+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:09.538375+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:10.538570+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122241024 unmapped: 29884416 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:11.538798+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:12.538970+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:13.539135+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:14.539325+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:15.539664+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:16.539929+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:17.540125+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:18.540380+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121987072 unmapped: 30138368 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:19.540586+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:20.540777+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:21.541029+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 2404.3 total, 600.0 interval
                                           Cumulative writes: 13K writes, 49K keys, 13K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.02 MB/s
                                           Cumulative WAL: 13K writes, 4008 syncs, 3.45 writes per sync, written: 0.04 GB, 0.02 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 3081 writes, 10K keys, 3081 commit groups, 1.0 writes per commit group, ingest: 9.92 MB, 0.02 MB/s
                                           Interval WAL: 3081 writes, 1324 syncs, 2.33 writes per sync, written: 0.01 GB, 0.02 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:22.541272+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:23.541460+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:24.541668+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:25.541886+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:26.542160+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:27.542352+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:28.542574+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:29.542732+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121995264 unmapped: 30130176 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:30.542898+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:31.543041+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:32.543234+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:33.543467+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:34.543683+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:35.543859+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:36.544123+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:37.544315+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:38.544470+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:39.544644+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:40.544758+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122003456 unmapped: 30121984 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:41.544885+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122044416 unmapped: 30081024 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config show' '{prefix=config show}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:42.545051+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122052608 unmapped: 30072832 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:43.545191+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 30474240 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:44.545324+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121765888 unmapped: 30359552 heap: 152125440 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:45.545463+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'perf dump' '{prefix=perf dump}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'perf dump' '{prefix=perf dump}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'perf histogram dump' '{prefix=perf histogram dump}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'perf histogram dump' '{prefix=perf histogram dump}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 122126336 unmapped: 41041920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'perf schema' '{prefix=perf schema}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'perf schema' '{prefix=perf schema}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:46.545671+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:47.545868+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:48.546606+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:49.546790+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:50.547041+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:51.547241+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:52.547645+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:53.548044+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:54.548318+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121610240 unmapped: 41558016 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:55.548615+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:56.548848+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:57.549111+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:58.549272+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:29:59.549511+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:00.549715+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:01.549898+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:02.550070+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:03.550242+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:04.550399+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:05.550603+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:06.550908+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:07.551084+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:08.551345+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:09.551699+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:10.551999+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:11.552220+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:12.552421+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:13.552608+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121618432 unmapped: 41549824 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:14.552842+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 41541632 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:15.552962+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 41541632 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:16.553161+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121626624 unmapped: 41541632 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:17.553319+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:18.553766+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:19.554055+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:20.554337+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:21.554588+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:22.554821+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:23.555143+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:24.555480+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:25.555798+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:26.556661+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:27.556958+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:28.557272+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:29.557617+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121634816 unmapped: 41533440 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:30.557849+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:31.558124+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:32.558415+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:33.558696+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:34.558981+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:35.559301+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:36.560022+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:37.560224+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:38.560484+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:39.560894+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:40.561152+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:41.561432+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:42.561754+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:43.562050+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:44.562314+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:45.562900+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:46.563208+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:47.563683+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:48.563900+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:49.564127+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:50.564398+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:51.564589+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:52.564833+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121643008 unmapped: 41525248 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:53.565121+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:54.565340+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:55.566366+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:56.566619+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:57.567021+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:58.567244+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:30:59.568002+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:00.568384+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:01.568752+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:02.569006+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:03.569436+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:04.569671+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:05.569910+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:06.570193+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:07.570539+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:08.570975+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:09.571266+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:10.571552+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:11.571957+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:12.572196+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:13.572715+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121651200 unmapped: 41517056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:14.572939+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:15.573225+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:16.573827+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:17.574039+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:18.574222+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:19.574573+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:20.574766+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:00.260 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:21.574992+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:22.575176+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:23.575309+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:24.575661+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:25.575869+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:26.576109+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:27.576273+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:28.576473+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:29.576683+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121659392 unmapped: 41508864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:30.576866+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:31.577256+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:32.577561+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:33.577932+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:34.578181+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:35.578387+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:36.578742+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:37.578964+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:38.579149+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121667584 unmapped: 41500672 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:39.579365+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 41492480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:40.579573+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 41492480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:41.579776+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121675776 unmapped: 41492480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:42.580019+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:43.580266+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:44.580461+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:45.580702+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:46.582186+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:47.582326+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:48.582527+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:49.582685+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:50.583163+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:51.583322+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:52.583599+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:53.583824+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:54.583999+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:55.584213+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:56.584425+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:57.584613+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121683968 unmapped: 41484288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:58.584801+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:31:59.584960+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:00.585107+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:01.585276+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:02.585415+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1308041 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:03.585566+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:04.585784+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7ca000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121692160 unmapped: 41476096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 234.130950928s of 234.467590332s, submitted: 41
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:05.586008+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121716736 unmapped: 41451520 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [0,1,0,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:06.586234+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121757696 unmapped: 41410560 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:07.586355+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120627200 unmapped: 42541056 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307821 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:08.586524+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120635392 unmapped: 42532864 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:09.586638+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120651776 unmapped: 42516480 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [0,0,0,0,0,0,0,0,1,1])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:10.586810+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120659968 unmapped: 42508288 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:11.587049+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120668160 unmapped: 42500096 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:12.587200+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120676352 unmapped: 42491904 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:13.587398+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120692736 unmapped: 42475520 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:14.587666+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120733696 unmapped: 42434560 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore(/var/lib/ceph/osd/ceph-0) _kv_sync_thread utilization: idle 5.367417336s of 10.023312569s, submitted: 306
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:15.587919+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120766464 unmapped: 42401792 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:16.588108+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120791040 unmapped: 42377216 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:17.588257+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120799232 unmapped: 42369024 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:18.588469+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120799232 unmapped: 42369024 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:19.588700+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:20.588847+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:21.589032+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:22.589226+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120807424 unmapped: 42360832 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:23.589395+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120815616 unmapped: 42352640 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:24.589583+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120815616 unmapped: 42352640 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:25.589739+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120815616 unmapped: 42352640 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:26.589956+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:27.590108+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:28.590309+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:29.590544+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:30.590739+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120823808 unmapped: 42344448 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:31.590899+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:32.591057+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:33.591213+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:34.591370+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:35.591555+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120832000 unmapped: 42336256 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:36.591761+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:37.591894+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:38.592047+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:39.592170+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:40.592312+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120840192 unmapped: 42328064 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:41.592464+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:42.592657+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:43.592871+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:44.592988+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:45.593221+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:46.593454+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120848384 unmapped: 42319872 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:47.593633+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:48.593812+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:49.593999+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:50.594174+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:51.594321+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:52.594486+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:53.594671+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:54.594875+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120856576 unmapped: 42311680 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:55.595050+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:56.595349+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:57.595581+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:58.595840+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:32:59.596087+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120864768 unmapped: 42303488 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:00.596317+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:01.596585+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:02.596834+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:03.596996+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:04.597400+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:05.597605+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:06.597844+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:07.597978+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:08.598178+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:09.598342+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:10.598515+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:11.598747+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:12.598976+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:13.599148+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:14.599270+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:15.599430+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:16.599733+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:17.599939+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:18.600100+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:19.600284+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:20.600538+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:21.600666+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:22.600829+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:23.601043+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:24.601228+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:25.601384+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:26.601645+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:27.601828+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:28.601999+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:29.602330+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:30.602614+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:31.602790+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:32.602980+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:33.603154+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:34.603395+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:35.604640+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:36.605693+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:37.605998+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:38.606236+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:39.607608+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:40.608272+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:41.609365+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:42.610043+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:43.610598+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:44.610737+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:45.610913+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:46.611433+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:47.611870+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:48.612291+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:49.612534+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:50.612699+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:51.612954+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:52.613275+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:53.613591+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:54.613788+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:55.613994+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:56.614284+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:57.614630+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:58.614821+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:33:59.615101+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:00.615277+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:01.615622+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:02.615776+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:03.616019+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:04.616238+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:05.616439+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:06.616753+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:07.617034+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:08.617455+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:09.617832+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:10.618005+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:11.618198+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:12.618358+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:13.618555+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:14.618731+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:15.618844+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:16.619040+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:17.619235+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:18.619380+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:19.619572+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:20.619798+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:21.619953+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets getting new tickets!
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:22.620297+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _finish_auth 0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:22.622158+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:23.620411+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:24.620556+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:25.620727+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:26.620957+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:27.621082+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:28.621212+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:29.621383+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:30.621545+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:31.621691+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:32.621878+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:33.622041+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:34.622191+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:35.622372+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:36.622570+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:37.622727+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:38.622871+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:39.623042+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:40.623630+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:41.623776+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:42.624294+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:43.624654+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:44.625004+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:45.625589+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:46.626103+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:47.626446+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:48.626725+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:49.626905+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120872960 unmapped: 42295296 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:50.627138+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:51.627353+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:52.627584+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:53.627809+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:54.628062+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:55.628380+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:56.628657+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:57.628983+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:58.629268+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:34:59.629460+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:00.629685+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:01.629901+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:02.630193+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120881152 unmapped: 42287104 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:03.630428+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120889344 unmapped: 42278912 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:04.630607+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120889344 unmapped: 42278912 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:05.630844+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:06.631088+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:07.631402+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:08.631712+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:09.631921+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:10.632213+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:11.632643+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:12.632782+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:13.632941+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:14.633122+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:15.633338+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:16.633589+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:17.633784+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:18.633954+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:19.634087+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120897536 unmapped: 42270720 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:20.634244+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:21.634480+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:22.634711+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:23.634905+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:24.635072+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:25.635240+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:26.635464+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:27.635673+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:28.635859+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:29.636031+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:30.636177+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:31.636352+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:32.636609+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120905728 unmapped: 42262528 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:33.636823+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:34.636996+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:35.637202+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:36.637450+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:37.637626+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120913920 unmapped: 42254336 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:38.637831+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:39.638021+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:40.638263+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:41.638469+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:42.638705+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:43.638856+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:44.639653+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:45.640951+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:46.641641+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:47.642445+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:48.643028+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:49.643529+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:50.643820+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:51.644034+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:52.644319+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:53.644580+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:54.644774+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:55.644989+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:56.645434+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:57.645766+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:58.646058+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120922112 unmapped: 42246144 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:35:59.646587+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:00.646733+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:01.646891+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:02.647165+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:03.647650+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:04.647896+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:05.648304+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:06.648544+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:07.648783+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:08.649041+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120938496 unmapped: 42229760 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:09.649398+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:10.649637+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:11.649933+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:12.650299+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:13.650571+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:14.650695+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:15.650945+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:16.651188+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:17.651340+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:18.651529+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:19.651745+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:20.652102+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:21.652315+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:22.652587+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:23.652750+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:24.652948+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:25.653146+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:26.653370+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120946688 unmapped: 42221568 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:27.653566+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:28.653759+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:29.653940+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:30.654139+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:31.654319+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:32.654467+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:33.654648+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120954880 unmapped: 42213376 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:34.654802+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:35.654979+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:36.655196+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:37.655327+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:38.655527+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:39.655681+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:40.655910+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:41.656142+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:42.656363+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:43.656559+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:44.656711+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:45.656850+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120963072 unmapped: 42205184 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:46.657099+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:47.657342+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:48.657810+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:49.658048+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:50.658769+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:51.660911+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:52.661889+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:53.662968+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:54.663773+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:55.663930+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:56.664641+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:57.664807+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:58.665240+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120971264 unmapped: 42196992 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:36:59.665404+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:00.665544+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a562d3d800 session 0x55a562e5f4a0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562214c00
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:01.665734+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:02.665905+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:03.666052+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:04.666371+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:05.666512+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:06.666808+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:07.667009+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:08.667260+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:09.667450+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120987648 unmapped: 42180608 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:10.667558+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:11.667729+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:12.668021+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:13.668249+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:14.668443+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:15.668593+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:16.668834+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:17.669028+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:18.669245+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:19.669403+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:20.669548+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:21.669734+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 120995840 unmapped: 42172416 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:22.670256+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:23.670612+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:24.670955+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:25.671106+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:26.671555+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:27.671701+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:28.671907+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:29.672094+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121004032 unmapped: 42164224 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:30.672291+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:31.672650+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:32.672940+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:33.673083+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:34.673251+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:35.673388+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:36.673622+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:37.673802+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:38.674053+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:39.674237+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:40.674394+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:41.674572+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:42.674766+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121012224 unmapped: 42156032 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:43.674925+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121020416 unmapped: 42147840 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:44.675129+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121020416 unmapped: 42147840 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:45.675279+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121020416 unmapped: 42147840 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:46.675528+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:47.675693+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:48.675887+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:49.676117+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:50.676266+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:51.676386+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:52.676604+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:53.676764+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:54.676959+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:55.677129+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:56.677305+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:57.677474+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121028608 unmapped: 42139648 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:58.677684+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:37:59.677853+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:00.678023+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:01.678187+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:02.678292+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:03.678451+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:04.678611+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:05.678866+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:06.679136+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121036800 unmapped: 42131456 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:07.679278+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 42123264 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:08.679410+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 42123264 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:09.679648+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121044992 unmapped: 42123264 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:10.679849+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:11.680027+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:12.680201+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:13.680403+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:14.680599+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:15.680802+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:16.681037+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:17.681322+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:18.681521+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:19.681674+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:20.681866+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:21.682053+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:22.682267+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:23.682553+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:24.682716+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:25.682942+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:26.683223+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:27.683453+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:28.683598+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:29.683853+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:30.684048+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121053184 unmapped: 42115072 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:31.684246+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:32.684441+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:33.684629+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:34.684779+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:35.684959+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121061376 unmapped: 42106880 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:36.685169+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:37.685346+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:38.685568+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:39.685753+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:40.685936+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:41.686104+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:42.686441+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:43.686616+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:44.686777+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:45.686967+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:46.687208+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121069568 unmapped: 42098688 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:47.687470+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:48.687672+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:49.687859+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:50.688052+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:51.688189+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a5621f9800 session 0x55a56370c780
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a563d88000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 ms_handle_reset con 0x55a561d43400 session 0x55a560b98000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: handle_auth_request added challenge on 0x55a562202000
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:52.688348+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:53.688608+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:54.688795+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:55.688971+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:56.689182+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:57.689356+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:58.689470+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:38:59.689702+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:00.689928+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:01.690158+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:02.690411+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:03.690578+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:04.690771+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:05.691044+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:06.691291+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:07.691587+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:08.691768+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:09.692011+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121077760 unmapped: 42090496 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:10.692217+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:11.692434+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:12.692691+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:13.692907+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:14.693152+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121085952 unmapped: 42082304 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:15.693327+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:16.693585+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:17.693780+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:18.694002+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:19.694233+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121094144 unmapped: 42074112 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:20.694415+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:21.694590+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                           ** DB Stats **
                                           Uptime(secs): 3004.3 total, 600.0 interval
                                           Cumulative writes: 14K writes, 51K keys, 14K commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.01 MB/s
                                           Cumulative WAL: 14K writes, 4378 syncs, 3.33 writes per sync, written: 0.04 GB, 0.01 MB/s
                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                           Interval writes: 764 writes, 1182 keys, 764 commit groups, 1.0 writes per commit group, ingest: 0.38 MB, 0.00 MB/s
                                           Interval WAL: 764 writes, 370 syncs, 2.06 writes per sync, written: 0.00 GB, 0.00 MB/s
                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:22.694739+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:23.694924+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:24.695107+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:25.695277+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:26.695594+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:27.695786+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:28.695976+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:29.696198+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:30.696378+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121102336 unmapped: 42065920 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:31.697434+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 42057728 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:32.697968+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 42057728 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:33.698565+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121110528 unmapped: 42057728 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:34.699194+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:35.699563+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:36.699840+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:37.700018+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:38.700412+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:39.700763+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:40.701045+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:41.701284+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:42.701569+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:43.701775+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:44.701964+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:45.702134+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:46.702547+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:47.702992+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:48.703420+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121126912 unmapped: 42041344 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:49.703748+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 42033152 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:50.703990+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 42033152 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _send_mon_message to mon.compute-0 at v2:192.168.122.100:3300/0
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:51.704259+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121135104 unmapped: 42033152 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:52.704556+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:53.704869+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:54.705032+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:55.705199+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:56.705437+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:57.705621+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121143296 unmapped: 42024960 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:58.705824+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:39:59.706076+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:00.706313+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:01.706563+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:02.706824+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:03.707299+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:04.707622+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:05.707839+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:06.708157+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:07.708347+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:08.708535+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:09.708853+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:10.709158+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:11.709400+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121151488 unmapped: 42016768 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:12.709614+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 42008576 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:13.709833+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121159680 unmapped: 42008576 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:14.710144+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:15.710363+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:16.710650+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:17.710821+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:18.711029+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:19.711230+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:20.711460+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:21.711684+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:22.711813+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:23.711947+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:24.712074+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.285714
Jan 23 10:41:00 compute-1 ceph-osd[77616]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0555556
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: bluestore.MempoolThread _resize_shards cache_size: 2845415833 kv_alloc: 1207959552 kv_used: 2144 kv_onode_alloc: 234881024 kv_onode_used: 464 meta_alloc: 1140850688 meta_used: 1307749 data_alloc: 218103808 data_used: 307200
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:25.712208+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121167872 unmapped: 42000384 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:26.712407+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config show' '{prefix=config show}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121421824 unmapped: 41746432 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}'
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:27.712607+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: osd.0 146 heartbeat osd_stat(store_statfs(0x4fa7cb000/0x0/0x4ffc00000, data 0x15dbfe3/0x16a1000, compress 0x0/0x0/0x0, omap 0x63b, meta 0x3d8f9c5), peers [1,2] op hist [])
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121290752 unmapped: 41877504 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:28.712827+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: prioritycache tune_memory target: 4294967296 mapped: 121364480 unmapped: 41803776 heap: 163168256 old mem: 2845415833 new mem: 2845415833
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: tick
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_tickets
Jan 23 10:41:00 compute-1 ceph-osd[77616]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-01-23T10:40:29.713006+0000)
Jan 23 10:41:00 compute-1 ceph-osd[77616]: do_command 'log dump' '{prefix=log dump}'
Jan 23 10:41:00 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:00 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:00 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:00.430 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:00 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Jan 23 10:41:00 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/716455899' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.27403 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3477516146' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: pgmap v1440: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.27200 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.17871 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/239682889' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1315781543' entity='client.admin' cmd=[{"prefix": "mgr metadata"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.27418 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2044172642' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.27221 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.17886 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1745865103' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.27436 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2564409329' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/716455899' entity='client.admin' cmd=[{"prefix": "mgr module ls"}]: dispatch
Jan 23 10:41:00 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services"} v 0)
Jan 23 10:41:00 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/148398286' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:01 compute-1 nova_compute[225705]: 2026-01-23 10:41:01.160 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:01 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions"} v 0)
Jan 23 10:41:01 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1085532514' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.27239 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.17910 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.27457 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/875828868' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2414639810' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/148398286' entity='client.admin' cmd=[{"prefix": "mgr services"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.17925 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.27254 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.27469 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.17937 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1085532514' entity='client.admin' cmd=[{"prefix": "mgr versions"}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: from='client.27269 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:01 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mon stat"} v 0)
Jan 23 10:41:01 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/712991638' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:01 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:41:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:41:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:41:02 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:02 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:41:02 compute-1 crontab[252154]: (root) LIST (root)
Jan 23 10:41:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:41:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:02.263 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:41:02 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:02 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:41:02 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:02.432 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:41:02 compute-1 nova_compute[225705]: 2026-01-23 10:41:02.651 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:02 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "node ls"} v 0)
Jan 23 10:41:02 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2934036653' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.27484 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: pgmap v1441: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1637687014' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.17949 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/887272108' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.27278 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.27496 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/712991638' entity='client.admin' cmd=[{"prefix": "mon stat"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/233774706' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.17961 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3110235279' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.27287 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3108936715' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1660365616' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/312762499' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/420082444' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Jan 23 10:41:03 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2272500806' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:03 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Jan 23 10:41:03 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3579800927' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Jan 23 10:41:04 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2527938535' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:04.267 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Jan 23 10:41:04 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3136483563' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:04 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:04 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:04 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:04.435 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Jan 23 10:41:04 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4081738896' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush show-tunables"} v 0)
Jan 23 10:41:04 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2279399780' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.27302 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3345460299' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1102334395' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2934036653' entity='client.admin' cmd=[{"prefix": "node ls"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.27314 -' entity='client.admin' cmd=[{"prefix": "balancer status detail", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2385619581' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/369562290' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3236167764' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2272500806' entity='client.admin' cmd=[{"prefix": "osd crush class ls"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1644214128' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/222612964' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3665764273' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4166326524' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1452223012' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3579800927' entity='client.admin' cmd=[{"prefix": "osd crush dump"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2110475920' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2527938535' entity='client.admin' cmd=[{"prefix": "log last", "channel": "cephadm", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/3690345591' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:04 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1233404613' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:41:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Jan 23 10:41:05 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1474099025' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Jan 23 10:41:05 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/265346978' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-1 systemd[1]: Starting Hostname Service...
Jan 23 10:41:05 compute-1 systemd[1]: Started Hostname Service.
Jan 23 10:41:05 compute-1 ceph-mon[80126]: pgmap v1442: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.27326 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3136483563' entity='client.admin' cmd=[{"prefix": "osd crush rule ls"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3117015144' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/674797361' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/924431456' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4081738896' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2279399780' entity='client.admin' cmd=[{"prefix": "osd crush show-tunables"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3359288182' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2663519361' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1474099025' entity='client.admin' cmd=[{"prefix": "osd crush tree", "show_shadow": true}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/265346978' entity='client.admin' cmd=[{"prefix": "mgr metadata", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2144523971' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4016238504' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:05 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3327562445' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:06 compute-1 nova_compute[225705]: 2026-01-23 10:41:06.164 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:06 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Jan 23 10:41:06 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/142027931' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:41:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:06.269 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:41:06 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:06 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:06 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:06.437 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:06 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Jan 23 10:41:06 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/516926078' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:41:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:41:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:06 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:41:07 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:07 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:41:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd utilization"} v 0)
Jan 23 10:41:07 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2337187949' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:07 compute-1 nova_compute[225705]: 2026-01-23 10:41:07.653 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:07 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Jan 23 10:41:07 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4288764517' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: pgmap v1443: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/631640421' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.18081 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/142027931' entity='client.admin' cmd=[{"prefix": "osd erasure-code-profile ls"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1856915728' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.18099 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1078464107' entity='client.admin' cmd=[{"prefix": "mgr module ls", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.18105 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/444520388' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1979006449' entity='client.admin' cmd=[{"prefix": "osd metadata"}]: dispatch
Jan 23 10:41:07 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/516926078' entity='client.admin' cmd=[{"prefix": "mgr services", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000026s ======
Jan 23 10:41:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:08.272 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000026s
Jan 23 10:41:08 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Jan 23 10:41:08 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1961194654' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:08 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:08 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:08.439 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.18120 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.27637 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.27655 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2337187949' entity='client.admin' cmd=[{"prefix": "osd utilization"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.27416 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.27664 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2907186737' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: pgmap v1444: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.18147 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.27673 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4288764517' entity='client.admin' cmd=[{"prefix": "mgr stat", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1493572992' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.18165 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.27428 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.27691 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1961194654' entity='client.admin' cmd=[{"prefix": "mgr versions", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2866924100' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2109945965' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:08 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4166585728' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:41:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:09 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "quorum_status"} v 0)
Jan 23 10:41:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/4171598296' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:09 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:09 compute-1 podman[253097]: 2026-01-23 10:41:09.715732631 +0000 UTC m=+0.119304440 container health_status 56c1e164ad4c504d2060539d1cdc2c76b9c0b23d0554351407a59d7d3e5eb162 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'cdc8d10f0e05d8a70b43cf26938a886cf76be4340fa6a898edc4cc90e10001b1-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79-29aa40701d2f6e5f4665dcf93e29bf33ce2a71a8f27381abc75bed21d3cc1e79'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:fa24ce4aa285e3632c86a53e8d0385d4c788d049da42dd06570ad9d44aae00de', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Jan 23 10:41:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.18177 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.27440 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.27703 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.27446 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/704825705' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.18195 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.27721 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.27461 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2035341214' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.27739 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/4171598296' entity='client.admin' cmd=[{"prefix": "quorum_status"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:10 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1386926707' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "versions"} v 0)
Jan 23 10:41:10 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/3513160830' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:10.276 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:10 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:10 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:10 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:10.441 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:10 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:10 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:10 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Jan 23 10:41:10 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2845937288' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-1 sudo[253285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:41:11 compute-1 sudo[253285]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:11 compute-1 sudo[253285]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:11 compute-1 sudo[253310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 check-host
Jan 23 10:41:11 compute-1 sudo[253310]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:11 compute-1 nova_compute[225705]: 2026-01-23 10:41:11.167 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:11 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Jan 23 10:41:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2100362027' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: pgmap v1445: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.27473 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.27775 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/4243432763' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/3513160830' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.27497 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.18273 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2845937288' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail", "format": "json-pretty"}]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3453822127' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:41:11 compute-1 sudo[253310]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:11 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:11 compute-1 sudo[253410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/true
Jan 23 10:41:11 compute-1 sudo[253410]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:11 compute-1 sudo[253410]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:11 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :NFS Server Now IN GRACE, duration 90
Jan 23 10:41:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_start_grace :STATE :EVENT :grace reload client info completed from backend
Jan 23 10:41:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] nfs_try_lift_grace :STATE :EVENT :check grace:reclaim complete(0) clid count(0)
Jan 23 10:41:12 compute-1 ceph-f3005f84-239a-55b6-a948-8f1fb592b920-nfs-cephfs-0-0-compute-1-bawllm[235535]: 23/01/2026 10:41:12 : epoch 69734c63 : compute-1 : ganesha.nfsd-2[main] rados_cluster_grace_enforcing :CLIENT ID :EVENT :rados_cluster_grace_enforcing: ret=-45
Jan 23 10:41:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.001000025s ======
Jan 23 10:41:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:12.280 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.001000025s
Jan 23 10:41:12 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:12 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:12 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:12.443 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:12 compute-1 nova_compute[225705]: 2026-01-23 10:41:12.656 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:12 compute-1 sudo[253513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Jan 23 10:41:12 compute-1 sudo[253513]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:12 compute-1 sudo[253513]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:12 compute-1 sudo[253541]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/f3005f84-239a-55b6-a948-8f1fb592b920/cephadm.1a8853661a9c1798390b8e8d13c27688c1b1327a075745af2ee40ac466f0ac36 --timeout 895 gather-facts
Jan 23 10:41:12 compute-1 sudo[253541]: pam_unix(sudo:session): session opened for user root(uid=0) by ceph-admin(uid=42477)
Jan 23 10:41:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='client.27503 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='client.27524 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2100362027' entity='client.admin' cmd=[{"prefix": "osd tree", "format": "json-pretty"}]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3307632022' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/230289181' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:41:13 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1826576521' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
Jan 23 10:41:13 compute-1 sudo[253541]: pam_unix(sudo:session): session closed for user root
Jan 23 10:41:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:13 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "config dump"} v 0)
Jan 23 10:41:13 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/852390323' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:41:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:14.284 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:14 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:14 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:14 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:14.445 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='client.27536 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: pgmap v1446: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 767 B/s rd, 0 op/s
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='client.18345 -' entity='client.admin' cmd=[{"prefix": "fs status", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/4160970063' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/2164360078' entity='client.admin' cmd=[{"prefix": "mds stat"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd='sessions' args=[]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='admin socket' entity='admin socket' cmd=sessions args=[]: finished
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/1744329486' entity='client.admin' cmd=[{"prefix": "mon dump"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' 
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='mgr.14604 192.168.122.100:0/2333519895' entity='mgr.compute-0.nbdygh' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/852390323' entity='client.admin' cmd=[{"prefix": "config dump"}]: dispatch
Jan 23 10:41:14 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Jan 23 10:41:14 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/2029845994' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:41:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon).osd e146 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Jan 23 10:41:15 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "df"} v 0)
Jan 23 10:41:15 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1996475410' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:41:15 compute-1 nova_compute[225705]: 2026-01-23 10:41:15.874 225709 DEBUG oslo_service.periodic_task [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Jan 23 10:41:15 compute-1 nova_compute[225705]: 2026-01-23 10:41:15.874 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Jan 23 10:41:15 compute-1 nova_compute[225705]: 2026-01-23 10:41:15.875 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Jan 23 10:41:15 compute-1 nova_compute[225705]: 2026-01-23 10:41:15.900 225709 DEBUG nova.compute.manager [None req-859c3c0c-8416-4039-bbdc-c9c6b602990a - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Jan 23 10:41:16 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs dump"} v 0)
Jan 23 10:41:16 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/1377865277' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:41:16 compute-1 nova_compute[225705]: 2026-01-23 10:41:16.172 225709 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 25 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Jan 23 10:41:16 compute-1 ceph-mon[80126]: pgmap v1447: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 511 B/s rd, 0 op/s
Jan 23 10:41:16 compute-1 ceph-mon[80126]: pgmap v1448: 353 pgs: 353 active+clean; 41 MiB data, 255 MiB used, 60 GiB / 60 GiB avail; 608 B/s rd, 0 op/s
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.18381 -' entity='client.admin' cmd=[{"prefix": "osd blocked-by", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.27907 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.27611 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.100:0/3699176065' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/2248239710' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/2029845994' entity='client.admin' cmd=[{"prefix": "df", "detail": "detail"}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.18405 -' entity='client.admin' cmd=[{"prefix": "osd df", "output_method": "tree", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/1348268149' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.101:0/1996475410' entity='client.admin' cmd=[{"prefix": "df"}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.18411 -' entity='client.admin' cmd=[{"prefix": "osd df", "target": ["mon-mgr", ""]}]: dispatch
Jan 23 10:41:16 compute-1 ceph-mon[80126]: from='client.? 192.168.122.102:0/254681425' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
Jan 23 10:41:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.100 - anonymous [23/Jan/2026:10:41:16.288 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:16 compute-1 radosgw[83743]: ====== starting new request req=0x7f0a2f0505d0 =====
Jan 23 10:41:16 compute-1 radosgw[83743]: ====== req done req=0x7f0a2f0505d0 op status=0 http_status=200 latency=0.000000000s ======
Jan 23 10:41:16 compute-1 radosgw[83743]: beast: 0x7f0a2f0505d0: 192.168.122.102 - anonymous [23/Jan/2026:10:41:16.448 +0000] "HEAD / HTTP/1.0" 200 0 - - - latency=0.000000000s
Jan 23 10:41:16 compute-1 ceph-mon[80126]: mon.compute-1@2(peon) e3 handle_command mon_command({"prefix": "fs ls"} v 0)
Jan 23 10:41:16 compute-1 ceph-mon[80126]: log_channel(audit) log [DBG] : from='client.? 192.168.122.101:0/816947763' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch
